Machine Learning Engineer Nanodegree

Capstone Project: DeepTesla

November 11, 2017

Note: This Jupyter Notebook contains the project's implementation code and accompanying commentary. Following the overall design, the project is divided into four parts:

  • Part 1: Data analysis and exploration. Read the data (the csv files and video files) and analyze it (e.g., the distribution of the steering angles).
  • Part 2: Data preprocessing. Convert the first nine videos into images and save them, together with the corresponding steering angles and image paths (written to a csv file), to disk as training data (later split into a training set and a validation set); convert the tenth video into images and save it, with its steering angles and image paths (as a csv file), to disk as the test set. Apply data reduction to both: cropping and resizing. Apply data augmentation (later, before each model is built, a Python data generator loads the data from disk in batches and augments each batch): horizontal flipping, added noise, etc. Also, experiment with a VAE+GAN to generate additional large-steering-angle images as training data.
  • Part 3: Train models with Keras and TensorFlow and export the model parameters. Train the benchmark model on the training set and evaluate it on the validation and test sets. The techniques involved include building and tuning convolutional and fully connected networks, building transfer-learning networks with frozen and fine-tuned parameters, ReLU and ELU activation functions, and Dropout layers to prevent overfitting. Finally, save the trained model and its parameters as the two files model.json and model.h5.
  • Part 4: Select the best model and generate the final video. Compare the models with time-series plots and choose the final model; test it on the test video epoch10_front.mkv and measure the error between the human and the model steering. Finally, generate the result video for epoch10_front.mkv and save it in ./output.


Part 1: Data Analysis and Exploration



Import the required libraries

In [1]:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import matplotlib.image as mpimg
import os
import cv2
import glob
import random
%matplotlib inline


1. A first look at the video frames

In [4]:
### Load a frame from the front-camera video

frame_capture = cv2.VideoCapture('./epochs/epoch01_front.mkv')
ret, img_test = frame_capture.read()
frame_capture.release()

# OpenCV returns frames in BGR order; convert to RGB so matplotlib shows the correct colors
img_test = cv2.cvtColor(img_test, cv2.COLOR_BGR2RGB)
plt.imshow(img_test)
Out[4]:
<matplotlib.image.AxesImage at 0x297d3cfa0f0>
In [5]:
### Check the image dimensions

img_test.shape
Out[5]:
(720, 1280, 3)
In [11]:
### Snapshots of different scenes in the videos

# Read all video snapshot images (different scenes) and display them
test_images = glob.glob('./images/Video_Snapshots/*.jpg')
fig, axs = plt.subplots(3, 3, figsize=(16,14))
fig.subplots_adjust(hspace = .004, wspace=.002)
axs = axs.ravel()

for i, im in enumerate(test_images):
    axs[i].imshow(mpimg.imread(im))
    axs[i].set_title('video snapshots'+str(i+1)+'.jpg', fontsize=20)
    axs[i].axis('off')
# Show all images
plt.tight_layout()
plt.show()


2. Steering-angle analysis

In [13]:
### Load all steering-angle csv files

csv_files = glob.glob('./epochs/*steering.csv')
csv_steers = pd.concat((pd.read_csv(f) for f in csv_files))

csv_steers.head()
Out[13]:
frame frame_index ts_micro wheel
0 0.0 NaN 1464650070285914 -1.0
1 1.0 NaN 1464650070319247 -1.0
2 2.0 NaN 1464650070352581 -1.0
3 3.0 NaN 1464650070385914 -1.0
4 4.0 NaN 1464650070419247 -1.0
In [14]:
### Inspect the steering-angle statistics

csv_steers.describe()
Out[14]:
frame frame_index ts_micro wheel
count 5400.000000 21600.000000 2.700000e+04 27000.000000
mean 1616.166667 1349.500000 1.464374e+15 -0.338667
std 1120.991308 779.440853 1.382617e+11 4.438301
min 0.000000 0.000000 1.464304e+15 -18.000000
25% 674.750000 674.750000 1.464304e+15 -2.500000
50% 1349.500000 1349.500000 1.464305e+15 0.000000
75% 2549.250000 2024.250000 1.464306e+15 1.500000
max 3899.000000 2699.000000 1.464650e+15 15.000000
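The statistics above show steering angles concentrated near zero (median 0.0). The degree of imbalance can be quantified with a quick check; this is a minimal sketch on synthetic angles (hypothetical data standing in for the real `wheel` column):

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the real 'wheel' column (hypothetical data,
# clipped to the observed range [-18, 15])
rng = np.random.default_rng(0)
wheel = pd.Series(rng.normal(0, 2.5, 27000)).clip(-18, 15)

# Fraction of frames with |angle| <= 5 degrees: large-angle frames are
# rare, which motivates oversampling them (see the VAE+GAN section in Part 2)
small = (wheel.abs() <= 5).mean()
print(f"{small:.1%} of frames have |wheel| <= 5")
```

On the real data the same two lines would be run against the `wheel` column of the concatenated csv files.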
In [15]:
### Histogram of the 'wheel' steering-angle distribution

wheel = csv_steers['wheel']
plt.figure()
plt.grid(True)
plt.hist(wheel, 100);
plt.xlabel('wheel')
plt.ylabel('counts')
plt.savefig('./images/img/Hist.png')
In [25]:
### Time-series plot of the steering angles

# Generate the time-series plot (2700 frames, 90 seconds)
plt.figure()
plt.plot(np.arange(0, 2700), wheel[:2700])
plt.xlabel('Frame Counts')
plt.ylabel('Steering Angles')
plt.grid(True)
plt.savefig('./images/img/Frame_angles.png')
plt.show()


3. Combining video frames with steering angles

In [27]:
### Read 'epoch03_front.mkv' and the corresponding 'epoch03_steering.csv' file

frame_capture3 = cv2.VideoCapture('./epochs/epoch03_front.mkv')

# Grab all frames into memory
epoch3_imgs = []
while True:
    ret, img = frame_capture3.read()
    if not ret:
        break
    epoch3_imgs.append(img)
print("The epoch03 images' length is:", len(epoch3_imgs))

# Read the corresponding csv file
csv_wheel3 = pd.read_csv('./epochs/epoch03_steering.csv')
steering_wheel3 = csv_wheel3['wheel'].values
frame_wheel3 = csv_wheel3['frame_index'].values
print("The epoch03 wheel length is", len(steering_wheel3))
The epoch03 images' length is: 2700
The epoch03 wheel length is 2700
In [28]:
### Randomly inspect frames with their steering angles

# Randomly show 9 frames and the corresponding steering angles
fig, axs = plt.subplots(3, 3, figsize=(16,14))
fig.subplots_adjust(hspace = 0, wspace=0)
axs = axs.ravel()
for i in range(9):
    index = random.choice(range(len(epoch3_imgs)))
    img = epoch3_imgs[index]
    # frames were read with OpenCV, so convert BGR to RGB for display
    axs[i].imshow(cv2.cvtColor(img, cv2.COLOR_BGR2RGB))
    axs[i].set_title('frame:'+ str(frame_wheel3[index]) + '  wheel:' + str(steering_wheel3[index]), fontsize=20)
    

# Show all images
plt.tight_layout()
plt.savefig('./images/img/frame_wheel.png')
plt.show()


Part 2: Data Preprocessing



Import the required libraries

Note: restart the kernel at the beginning of each part.

In [3]:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import cv2
import random
import os
import csv
import tensorflow as tf
import pickle


# Import the custom data-processing module 'preprocess_data'
import preprocess_data

# Import the custom 'vae_gan' module
import vae_gan
z_dim = 512
VG_PATH = './vg_images/'

%matplotlib inline
Using TensorFlow backend.


1. Single-image processing test

In [22]:
### <1> Read a test image

image_test = cv2.imread('./images/img/frame_1173.jpg')
image_test = cv2.cvtColor(image_test,cv2.COLOR_BGR2RGB)
print(image_test.shape)
plt.imshow(image_test)
(720, 1280, 3)
Out[22]:
<matplotlib.image.AxesImage at 0x206bc3415f8>
In [23]:
### <2> Crop off the sky and resize to 80*80; this data reduction is used later for the datasets (it speeds up model convergence)

image_test_crop = image_test[300:720, 0:1280]
image_test_resize = cv2.resize(image_test_crop, (80, 80))
print(image_test_resize.shape)
plt.imshow(image_test_resize)
(80, 80, 3)
Out[23]:
<matplotlib.image.AxesImage at 0x206bb86d4a8>
In [24]:
### <3> Flip the image horizontally, normalize the pixel values, and add noise; used later for data augmentation


# Flip the image horizontally
image_test_flip = cv2.flip(image_test_resize, 1)
# Normalize pixel values from [0, 255] into [0.1, 0.9]
image_test_flip = 0.1 + image_test_flip * (0.9 - 0.1) / 255.0
# Add Gaussian noise, clipped back into the normalized range
noisy_img = image_test_flip + 0.05 * np.random.randn(*image_test_flip.shape)
noisy_img = np.clip(noisy_img, 0.1, 0.9)
plt.imshow(noisy_img)
Out[24]:
<matplotlib.image.AxesImage at 0x206bb8d40f0>
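The [0, 255] → [0.1, 0.9] mapping used above can be wrapped in a small helper; a minimal numpy-only sketch (the name `normalize_image` is illustrative, not from the project code):

```python
import numpy as np

def normalize_image(img, lo=0.1, hi=0.9):
    """Linearly map uint8 pixel values [0, 255] into [lo, hi]."""
    return lo + img.astype(np.float32) * (hi - lo) / 255.0

# A dummy 2x2 grayscale image covering the extremes of the range
img = np.array([[0, 255], [128, 64]], dtype=np.uint8)
out = normalize_image(img)
print(out.min(), out.max())  # pixel 0 maps to 0.1, pixel 255 maps to 0.9
```

Keeping the values away from exactly 0 and 1 leaves headroom for the additive noise before clipping.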


2. Generating the raw training and test data

In [2]:
### <1> Use 'load_data' from 'preprocess_data' to convert all mkv files into images (rgb jpg format), returning them in memory and persisting them to disk
### (saved in the ./train_images and ./test_images folders, respectively);
### the corresponding steering angles and image paths are likewise returned in memory and persisted to disk (as csv files in ./train_images and ./test_images)

# Apply the data reduction from "1. Single-image processing test" (crop, resize) to generate the base data
train_imgs, train_wheels = preprocess_data.load_data('train',write_to_disk=True)
test_imgs, test_wheels = preprocess_data.load_data('test',write_to_disk=True)
load data start!
The epoch1 mkv is processing
The epoch2 mkv is processing
The epoch3 mkv is processing
The epoch4 mkv is processing
The epoch5 mkv is processing
The epoch6 mkv is processing
The epoch7 mkv is processing
The epoch8 mkv is processing
The epoch9 mkv is processing
loading data filished!
load data start!
The epoch10 mkv is processing
loading data filished!
In [3]:
### <2> Check the shapes of the generated training and test images

print("The shape of train images is", train_imgs.shape)
print("The shape of train wheels", train_wheels.shape)
print("The shape of test images is", test_imgs.shape)
print("The shape of test wheels", test_wheels.shape)
The shape of train images is (24300, 80, 80, 3)
The shape of train wheels (24300, 1)
The shape of test images is (2700, 80, 80, 3)
The shape of test wheels (2700, 1)
In [4]:
### <3> Check the images and csv files saved on disk

f, (ax1,ax2) = plt.subplots(1, 2, figsize=(20, 10))
f.tight_layout()
img_train = plt.imread('./train_images/epoch01_steering_0.jpg')
csv_train = pd.read_csv('./train_images/train_images.csv')
ax1.imshow(img_train)
ax1.set_title(csv_train['img_path'][0]+'    wheel:'+str(csv_train['wheel'][0]), fontsize=15)
img_test = plt.imread('./test_images/epoch10_steering_0.jpg')
csv_test = pd.read_csv('./test_images/test_images.csv')
ax2.imshow(img_test)
ax2.set_title(csv_test['img_path'][0]+'    wheel:'+str(csv_test['wheel'][0]), fontsize=15)
Out[4]:
<matplotlib.text.Text at 0x279c17ad898>
In [1]:
### <4> Overall data-handling strategy:
### use a Python data generator to load the data from disk in batches, augmenting each batch (flipping and added noise)
### after augmentation, the data is three times its original size
### see the 'generator' function in './preprocess_data.py' for the implementation

import preprocess_data
from sklearn.model_selection import train_test_split

# Load the training data
# Training and validation data are generated separately for each model later
samples = preprocess_data.read_csv('./train_images/train_images.csv')
train_samples, validation_samples = train_test_split(samples, test_size=0.2)
train_generator = preprocess_data.generator(train_samples, batch_size=128)
validation_generator = preprocess_data.generator(validation_samples, batch_size=128)
In [2]:
# Test 'generator'
my_output = []
for i in range(10):
    my_output = (next(train_generator))
print(my_output[0].shape)
print(my_output[1].shape)
(384, 80, 80, 3)
(384,)


3. Generating training data with VAE+GAN (optional)

Notes

In [4]:
### <1> Create folders for intermediate outputs

if not os.path.exists("./outputs/results_"+"autoencoder"):
    os.makedirs("./outputs/results_"+"autoencoder")
if not os.path.exists("./outputs/samples_"+"autoencoder"):
    os.makedirs("./outputs/samples_"+"autoencoder")
In [5]:
### <2> Define the training generator for the model
### Images with steering angle below -5 or above 5 are selected as training data (the 'wheel' histogram in Part 1 shows such samples are scarce)

def read_csv():
    """
    Read the csv file and store the rows in a list
    """
    samples_original = []
    samples_steering = []
    with open('./train_images/train_images.csv') as csvfile:
        reader = csv.reader(csvfile)
        for line in reader:
            samples_original.append(line)
    samples_original = samples_original[1:]
        
    # Extract samples with steering angle below -5 or above 5
    for sample in samples_original:
        if (float(sample[0]) < -5.0) or (float(sample[0]) > 5.0):
            samples_steering.append(sample)
    
    # Oversample the large-angle samples until there are 8000
    while True:
        if len(samples_steering) < 8000:
            #print(len(samples_steering))
            index = random.choice(range(len(samples_steering)))
            samples_steering.append(samples_steering[index])
        else:
            break
        
    return samples_steering


def generator_train(samples, batch_size=64):
    """
    # The generator of the samples
    """
    num_samples = len(samples)
    while True:  # loop forever, yielding batches
        for offset in range(0, num_samples, batch_size):
            batch_samples = samples[offset:offset+batch_size]
            X_train = []
            for sample in batch_samples:
                X = cv2.imread(sample[1])
                X = cv2.resize(X, (160, 80))
                X = cv2.cvtColor(X,cv2.COLOR_BGR2RGB)
                X = X/127.5 - 1.
                X_train.append(X)
                
                
            # Unsupervised learning: the wheels are not used; Z serves as the random input
            X_train = np.array(X_train)
            Z = np.random.normal(0, 1, (X_train.shape[0], z_dim))
            yield Z, X_train

# Read each row of the csv file as one sample
train_samles = read_csv()
print(np.array(train_samles).shape)

# Test 'generator_train'
gen = generator_train(train_samles)  # create the generator once so batches advance
my_output = []
for i in range(10):
    my_output = next(gen)
print(my_output[0].shape)
print(my_output[1].shape)
(8000, 2)
(64, 512)
(64, 80, 160, 3)
In [6]:
### <3> Train the model and generate images

with tf.Session() as sess:
    g_train, d_train, sampler, saver, loader, extras = vae_gan.get_model(sess=sess, name="autoencoder", batch_size=64, gpu=0)
    sampler_train = vae_gan.train_model("autoencoder", g_train, d_train, sampler,
                generator_train(train_samles),
                samples_per_epoch=8000,
                nb_epoch=20, verbose=1, saver=saver
                )
    
    # After training, use the sampler to generate images
    gen = generator_train(train_samles)  # a single generator instance, so each next() yields a new batch
    new_samples = []
    for i in range(8000//64):
        output = next(gen)
        new_sample, _ = sampler(output[0], output[1])
        new_samples.append(new_sample)
D:\Anaconda3\envs\carnd-term1\lib\site-packages\keras\engine\topology.py:379: UserWarning: The `regularizers` property of layers/models is deprecated. Regularization losses are now managed via the `losses` layer/model property.
  warnings.warn('The `regularizers` property of layers/models '
G.shape:  (64, 80, 160, 3)
E.shape:  [(64, 512), (64, 512)]
D.shape:  [(64, 1), (64, 5, 10, 512)]
Generator variables:
autoencoder/g_h0_lin_W:0
autoencoder/g_h0_lin_b:0
autoencoder/g_bn0_gamma:0
autoencoder/g_bn0_beta:0
autoencoder/g_h1/w:0
autoencoder/g_h1/biases:0
autoencoder/g_bn1_gamma:0
autoencoder/g_bn1_beta:0
autoencoder/g_h2/w:0
autoencoder/g_h2/biases:0
autoencoder/g_bn2_gamma:0
autoencoder/g_bn2_beta:0
autoencoder/g_h3/w:0
autoencoder/g_h3/biases:0
autoencoder/g_bn3_gamma:0
autoencoder/g_bn3_beta:0
autoencoder/g_h4/w:0
autoencoder/g_h4/biases:0
Discriminator variables:
autoencoder/d_h0_conv_W:0
autoencoder/d_h0_conv_b:0
autoencoder/d_h1_conv_W:0
autoencoder/d_h1_conv_b:0
autoencoder/d_bn1_gamma:0
autoencoder/d_bn1_beta:0
autoencoder/d_h2_conv_W:0
autoencoder/d_h2_conv_b:0
autoencoder/d_bn2_gamma:0
autoencoder/d_bn2_beta:0
autoencoder/d_h3_conv_W:0
autoencoder/d_h3_conv_b:0
autoencoder/d_bn3_gamma:0
autoencoder/d_bn3_beta:0
autoencoder/d_h3_lin_W:0
autoencoder/d_h3_lin_b:0
Encoder variables:
autoencoder/e_h0_conv_W:0
autoencoder/e_h0_conv_b:0
autoencoder/e_h1_conv_W:0
autoencoder/e_h1_conv_b:0
autoencoder/e_bn1_gamma:0
autoencoder/e_bn1_beta:0
autoencoder/e_h2_conv_W:0
autoencoder/e_h2_conv_b:0
autoencoder/e_bn2_gamma:0
autoencoder/e_bn2_beta:0
autoencoder/e_h3_conv_W:0
autoencoder/e_h3_conv_b:0
autoencoder/e_bn3_gamma:0
autoencoder/e_bn3_beta:0
autoencoder/e_h3_lin_W:0
autoencoder/e_h3_lin_b:0
autoencoder/e_h4_lin_W:0
autoencoder/e_h4_lin_b:0
Epoch 1/20
8000/8000 [==============================] - 252s - g_loss: 1.8224 - d_loss: 1.8719 - d_loss_fake: 1.1442 - d_loss_legit: 0.7277 - time: 1.8067   
Epoch 2/20
8000/8000 [==============================] - 249s - g_loss: 1.9424 - d_loss: 2.7553 - d_loss_fake: 1.4145 - d_loss_legit: 1.3408 - time: 1.7958   
Epoch 3/20
8000/8000 [==============================] - 249s - g_loss: 2.4958 - d_loss: 2.5902 - d_loss_fake: 1.2287 - d_loss_legit: 1.3615 - time: 1.7970   
Epoch 4/20
8000/8000 [==============================] - 249s - g_loss: 2.7573 - d_loss: 2.1233 - d_loss_fake: 0.8719 - d_loss_legit: 1.2514 - time: 1.7951   
Epoch 5/20
8000/8000 [==============================] - 253s - g_loss: 2.9694 - d_loss: 1.7823 - d_loss_fake: 0.7848 - d_loss_legit: 0.9975 - time: 1.8014   
Epoch 6/20
8000/8000 [==============================] - 252s - g_loss: 3.1063 - d_loss: 1.7980 - d_loss_fake: 0.7422 - d_loss_legit: 1.0557 - time: 1.8025   
Epoch 7/20
8000/8000 [==============================] - 252s - g_loss: 2.9509 - d_loss: 1.7951 - d_loss_fake: 0.8246 - d_loss_legit: 0.9705 - time: 1.8015   
Epoch 8/20
8000/8000 [==============================] - 253s - g_loss: 3.1742 - d_loss: 1.7414 - d_loss_fake: 0.7171 - d_loss_legit: 1.0244 - time: 1.8048   
Epoch 9/20
8000/8000 [==============================] - 251s - g_loss: 2.8747 - d_loss: 1.5559 - d_loss_fake: 0.6708 - d_loss_legit: 0.8850 - time: 1.8055   
Epoch 10/20
8000/8000 [==============================] - 252s - g_loss: 2.9815 - d_loss: 1.5117 - d_loss_fake: 0.6822 - d_loss_legit: 0.8295 - time: 1.8043   
Epoch 11/20
8000/8000 [==============================] - 250s - g_loss: 2.9477 - d_loss: 1.3911 - d_loss_fake: 0.5788 - d_loss_legit: 0.8123 - time: 1.8034   
Epoch 12/20
8000/8000 [==============================] - 250s - g_loss: 2.7813 - d_loss: 1.3522 - d_loss_fake: 0.5656 - d_loss_legit: 0.7866 - time: 1.8010   
Epoch 13/20
8000/8000 [==============================] - 250s - g_loss: 2.9709 - d_loss: 1.3922 - d_loss_fake: 0.6117 - d_loss_legit: 0.7805 - time: 1.8029   
Epoch 14/20
8000/8000 [==============================] - 251s - g_loss: 2.9065 - d_loss: 1.2279 - d_loss_fake: 0.5358 - d_loss_legit: 0.6920 - time: 1.7979   
Epoch 15/20
8000/8000 [==============================] - 252s - g_loss: 3.2203 - d_loss: 1.2404 - d_loss_fake: 0.5362 - d_loss_legit: 0.7042 - time: 1.8021   
Epoch 16/20
8000/8000 [==============================] - 252s - g_loss: 3.0949 - d_loss: 1.2104 - d_loss_fake: 0.5228 - d_loss_legit: 0.6876 - time: 1.8023   
Epoch 17/20
8000/8000 [==============================] - 251s - g_loss: 3.0838 - d_loss: 1.1478 - d_loss_fake: 0.4915 - d_loss_legit: 0.6563 - time: 1.8037   
Epoch 18/20
8000/8000 [==============================] - 252s - g_loss: 3.2523 - d_loss: 1.0994 - d_loss_fake: 0.4588 - d_loss_legit: 0.6405 - time: 1.8057   
Epoch 19/20
8000/8000 [==============================] - 251s - g_loss: 3.0428 - d_loss: 1.1564 - d_loss_fake: 0.5095 - d_loss_legit: 0.6469 - time: 1.8046   
Epoch 20/20
8000/8000 [==============================] - 251s - g_loss: 3.1535 - d_loss: 0.9698 - d_loss_fake: 0.4197 - d_loss_legit: 0.5501 - time: 1.8046   
In [7]:
### <4> Save and inspect the generated images
### In the figure, odd columns (counting from 1) are generated images and even columns (from 2) are the originals; the generated images largely reproduce the original features with subtle differences (steering angles essentially unchanged)
### The original model was trained for 200 epochs; for time reasons, and since this is a practice exercise, only 20 epochs are used here. Training can be extended later for further optimization

# Save the newly generated images and related info
pickle.dump(new_samples, open('./images/vg_images.p', 'wb'))
pickle.dump(train_samles, open('./images/train_samles.p', 'wb'))

# Inspect the newly generated images
image_test = cv2.imread('./outputs/samples_autoencoder/train_19_0.png')
image_test = cv2.cvtColor(image_test,cv2.COLOR_BGR2RGB)
plt.figure(figsize=(15,15)) 
plt.imshow(image_test)
Out[7]:
<matplotlib.image.AxesImage at 0x25e8b8667f0>
In [8]:
### <5> Load the newly generated images and related info, and check them

# Load the data
imgs_data = pickle.load(open('./images/vg_images.p', 'rb'))
wheels_data = pickle.load(open('./images/train_samles.p', 'rb'))

# Inspect the images
new_images = np.array(imgs_data).reshape((-1,80,160,3))
print(new_images.shape)
print(np.array(wheels_data).shape)
new_images = vae_gan.inverse_transform(new_images)
plt.imshow(new_images[799])
(8000, 80, 160, 3)
(8000, 2)
Out[8]:
<matplotlib.image.AxesImage at 0x25e6da7b940>
In [9]:
### <6> Save the images (as rgb jpg) and the steering angles with image paths (as a csv file) to the './vg_images' folder

preprocess_data.write_vg_images(new_images, wheels_data)
Write vg images success!
In [10]:
### <7> Compare the newly generated csv file on disk with the original

f, (ax1,ax2) = plt.subplots(1, 2, figsize=(15, 10))
f.tight_layout()
img_vg = plt.imread('./vg_images/vg_images0.jpg')
csv_vg = pd.read_csv('./vg_images/vg_images.csv')
ax1.imshow(img_vg)
ax1.set_title('VG IMAGE:  '+csv_vg['img_path'][0]+'    wheel:'+str(csv_vg['wheel'][0]), fontsize=15)
img_orginal = plt.imread(wheels_data[0][1])
ax2.imshow(img_orginal)
ax2.set_title('ORIGINAL:  '+wheels_data[0][1]+'    wheel:'+wheels_data[0][0], fontsize=15)
Out[10]:
<matplotlib.text.Text at 0x25e6661fdd8>

Data-handling note: before each model is trained, a Python data generator loads the data from disk in batches (it first reads the csv file, then loads the images by the paths listed there), augmenting each batch on the fly.
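The batching scheme just described can be sketched as follows. This is a minimal numpy-only illustration, not the project's `preprocess_data.generator` itself; negating the angle for the flipped copy is an assumption (a mirrored road image plausibly implies the opposite steering direction):

```python
import numpy as np

def batch_generator(images, angles, batch_size=4):
    """Yield augmented batches: each source image contributes itself,
    a horizontal flip (with negated angle), and a noisy copy (3x total)."""
    n = len(images)
    while True:  # loop forever, as Keras fit_generator expects
        for start in range(0, n, batch_size):
            X = images[start:start + batch_size]
            y = angles[start:start + batch_size]
            X_flip = X[:, :, ::-1, :]  # horizontal flip
            X_noisy = np.clip(X + 0.05 * np.random.randn(*X.shape), 0.0, 1.0)
            yield (np.concatenate([X, X_flip, X_noisy]),
                   np.concatenate([y, -y, y]))

# Toy data standing in for the 80x80x3 frames
imgs = np.random.rand(8, 80, 80, 3)
angs = np.linspace(-5, 5, 8)
Xb, yb = next(batch_generator(imgs, angs, batch_size=4))
print(Xb.shape, yb.shape)  # each batch of 4 becomes 12 samples
```

This matches the 3x expansion seen earlier, where a batch_size of 128 produced batches of 384 samples.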


Part 3: Model Training


Note: restart the kernel at the beginning of each model.


1.NVIDIA end-to-end model

(1) Benchmark model

In [1]:
### <1> Generate training data for the NVIDIA model with the generator

import preprocess_data
from sklearn.model_selection import train_test_split

# Load the training data
samples = preprocess_data.read_csv('./train_images/train_images.csv')
train_samples, validation_samples = train_test_split(samples, test_size=0.2)
train_generator = preprocess_data.generator(train_samples, batch_size=128)
validation_generator = preprocess_data.generator(validation_samples, batch_size=128)
In [2]:
### <2> Build the NVIDIA model

from keras.models import Sequential
from keras.layers import Convolution2D, Dropout, Flatten, Dense, Lambda

def Nvidia_model():
    """
    Nvidia model:referenced https://arxiv.org/pdf/1604.07316v1.pdf
    """
    print("Nvidia-------------")
    model = Sequential()
    model.add(Convolution2D(24,5,5,subsample=(2,2),activation='relu', input_shape=(80,80,3)))
    model.add(Convolution2D(36,5,5,subsample=(2,2),activation='relu'))
    model.add(Convolution2D(48,5,5,subsample=(2,2),activation='relu'))
    model.add(Convolution2D(64,3,3,activation='relu'))
    model.add(Convolution2D(64,3,3,activation='relu'))
    model.add(Flatten())
    model.add(Dense(1164, activation='relu'))
    model.add(Dense(100, activation='relu'))
    model.add(Dense(50, activation='relu'))
    model.add(Dense(10, activation='relu'))
    model.add(Dense(1))

    return model
Using TensorFlow backend.
In [3]:
Nvidia_model().summary()
Nvidia-------------
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
convolution2d_1 (Convolution2D)  (None, 38, 38, 24)    1824        convolution2d_input_1[0][0]      
____________________________________________________________________________________________________
convolution2d_2 (Convolution2D)  (None, 17, 17, 36)    21636       convolution2d_1[0][0]            
____________________________________________________________________________________________________
convolution2d_3 (Convolution2D)  (None, 7, 7, 48)      43248       convolution2d_2[0][0]            
____________________________________________________________________________________________________
convolution2d_4 (Convolution2D)  (None, 5, 5, 64)      27712       convolution2d_3[0][0]            
____________________________________________________________________________________________________
convolution2d_5 (Convolution2D)  (None, 3, 3, 64)      36928       convolution2d_4[0][0]            
____________________________________________________________________________________________________
flatten_1 (Flatten)              (None, 576)           0           convolution2d_5[0][0]            
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 1164)          671628      flatten_1[0][0]                  
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 100)           116500      dense_1[0][0]                    
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 50)            5050        dense_2[0][0]                    
____________________________________________________________________________________________________
dense_4 (Dense)                  (None, 10)            510         dense_3[0][0]                    
____________________________________________________________________________________________________
dense_5 (Dense)                  (None, 1)             11          dense_4[0][0]                    
====================================================================================================
Total params: 925,047
Trainable params: 925,047
Non-trainable params: 0
____________________________________________________________________________________________________
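The parameter counts in the summary above can be checked by hand: a convolution layer has (kernel_h * kernel_w * in_channels + 1) * filters parameters, and a dense layer has (n_in + 1) * n_out. A quick sketch:

```python
def conv_params(kh, kw, c_in, n_filters):
    """Weights + biases of a 2D convolution layer."""
    return (kh * kw * c_in + 1) * n_filters

def dense_params(n_in, n_out):
    """Weights + biases of a fully connected layer."""
    return (n_in + 1) * n_out

total = (conv_params(5, 5, 3, 24)      # 1824
         + conv_params(5, 5, 24, 36)   # 21636
         + conv_params(5, 5, 36, 48)   # 43248
         + conv_params(3, 3, 48, 64)   # 27712
         + conv_params(3, 3, 64, 64)   # 36928
         + dense_params(576, 1164)     # 671628 (3*3*64 = 576 after Flatten)
         + dense_params(1164, 100)     # 116500
         + dense_params(100, 50)       # 5050
         + dense_params(50, 10)        # 510
         + dense_params(10, 1))        # 11
print(total)  # 925047, matching the Keras summary
```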
In [4]:
### <3> Train the NVIDIA model


model = Nvidia_model()
model.compile(loss='mse', optimizer='adam')
history_object = model.fit_generator(train_generator, samples_per_epoch= 
                                     len(train_samples)*3, validation_data=validation_generator, 
                                     nb_val_samples=len(validation_samples)*3,nb_epoch=10)
Nvidia-------------
Epoch 1/10
58320/58320 [==============================] - 41s - loss: 17.8832 - val_loss: 10.7253
Epoch 2/10
58320/58320 [==============================] - 38s - loss: 7.3714 - val_loss: 4.8491
Epoch 3/10
58320/58320 [==============================] - 36s - loss: 3.8005 - val_loss: 2.4994
Epoch 4/10
58320/58320 [==============================] - 37s - loss: 2.1176 - val_loss: 1.5325
Epoch 5/10
58320/58320 [==============================] - 37s - loss: 1.3325 - val_loss: 1.1606
Epoch 6/10
58320/58320 [==============================] - 37s - loss: 1.0151 - val_loss: 0.8715
Epoch 7/10
58320/58320 [==============================] - 38s - loss: 0.8035 - val_loss: 0.8606
Epoch 8/10
58320/58320 [==============================] - 38s - loss: 0.6615 - val_loss: 0.7174
Epoch 9/10
58320/58320 [==============================] - 37s - loss: 0.5866 - val_loss: 0.5787
Epoch 10/10
58320/58320 [==============================] - 36s - loss: 0.5361 - val_loss: 0.5013
In [5]:
### <4> Analyze the results
import preprocess_data
import matplotlib.pyplot as plt

# Load the test data
test_imgs, test_wheels = preprocess_data.load_data('test')
test_imgs = preprocess_data.nomorlize_image(test_imgs)

# Evaluate the model on the test data
test_loss = model.evaluate(test_imgs, test_wheels, batch_size=128)
print('\n Test loss is:{}'.format(test_loss))

# Plot the training- and validation-set loss
plt.plot(history_object.history['loss'])
plt.plot(history_object.history['val_loss'])
plt.title('Nvidia model mean squared error loss')
plt.ylabel('mean squared error loss')
plt.xlabel('epoch')
plt.legend(['training set', 'validation set'], loc='upper right')
plt.show()
load data start!
The epoch10 mkv is processing
loading data filished!
2700/2700 [==============================] - 1s     

 Test loss is:8.861750322977702
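The reported loss is mean squared error over the test wheels. A minimal sketch of the metric that `evaluate` computes here (toy numbers, not project data):

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error between true and predicted steering angles."""
    return float(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

# Toy angles: a model that always predicts straight ahead
y_true = [-2.0, 0.0, 1.5, 3.0]
y_pred = [0.0, 0.0, 0.0, 0.0]
print(mse(y_true, y_pred))  # (4 + 0 + 2.25 + 9) / 4 = 3.8125
```

Because the error is squared, a few large-angle frames can dominate the score, which is one reason the large-angle samples matter.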
In [6]:
### <5> Save the model
import os

# Paths for the model file and the json file
model_saved_path = os.path.join('./models/', "nvidia_model.h5")
json_saved_path = os.path.join('./models/', "nvidia_model.json")

# Save the json
json_model = model.to_json()
with open(json_saved_path, "w") as json_file:
    json_file.write(json_model)

# Save the model
model.save(model_saved_path)
In [14]:
### Optional: <6> Add the VAE+GAN-generated images to the training data

# Load the training data
samples = preprocess_data.read_csv('./train_images/train_images.csv')
vg_samples = preprocess_data.read_csv('./vg_images/vg_images.csv')
train_samples, validation_samples = train_test_split(samples, test_size=0.2)
train_samples = train_samples + vg_samples
train_generator = preprocess_data.generator(train_samples, batch_size=128)
validation_generator = preprocess_data.generator(validation_samples, batch_size=128)


model_plus = Nvidia_model()
model_plus.compile(loss='mse', optimizer='adam')
history_object = model_plus.fit_generator(train_generator, samples_per_epoch= 
                                     len(train_samples)*3, validation_data=validation_generator, 
                                     nb_val_samples=len(validation_samples)*3,nb_epoch=5)
Nvidia-------------
Epoch 1/5
82320/82320 [==============================] - 56s - loss: 38.1551 - val_loss: 20.3620
Epoch 2/5
82320/82320 [==============================] - 53s - loss: 40.7791 - val_loss: 20.3723
Epoch 3/5
82320/82320 [==============================] - 53s - loss: 40.5858 - val_loss: 20.3775
Epoch 4/5
82320/82320 [==============================] - 53s - loss: 41.5649 - val_loss: 20.3687
Epoch 5/5
82320/82320 [==============================] - 53s - loss: 41.4422 - val_loss: 20.3613
In [15]:
# Evaluate the model on the test data (test_imgs was already normalized in <4>, so no further normalization is needed)
test_loss = model_plus.evaluate(test_imgs, test_wheels, batch_size=128)
print('Test loss is:{}'.format(test_loss))
2688/2700 [============================>.] - ETA: 0sTest loss is:7.497045198016696

Analysis: after adding the VAE+GAN-generated data, the baseline model no longer converges well on the training and validation sets, but its test-set performance improves (suggesting some gain in generalization). Overall, though, the VAE+GAN generator itself was trained for too few epochs, so the generated data is not used in what follows; this is left for future optimization.

(2) Refined Model (ELU, BatchNormalization, he_normal weight initialization, MaxPooling, Dropout)

In [1]:
### <1> Generate training data for the NVIDIA refined model with the generator

import preprocess_data
from sklearn.model_selection import train_test_split

# Load the training data
samples = preprocess_data.read_csv('./train_images/train_images.csv')
train_samples, validation_samples = train_test_split(samples, test_size=0.2)
train_generator = preprocess_data.generator(train_samples, batch_size=128)
validation_generator = preprocess_data.generator(validation_samples, batch_size=128)
In [2]:
### <2> Build the NVIDIA refined model

from keras.models import Sequential
from keras.layers import Convolution2D, Dropout, Flatten, Dense, Lambda, BatchNormalization, MaxPooling2D

def Nvidia_refined_model():
    """
    Nvidia refined model
    """
    print("Nvidia-------------")
    model = Sequential()
    model.add(Convolution2D(24,5,5,subsample=(2,2),activation='elu',border_mode='same',init='he_normal',input_shape=(80,80,3)))
    model.add(BatchNormalization())
    model.add(MaxPooling2D())  
    
    model.add(Convolution2D(36,5,5,subsample=(2,2),activation='elu',border_mode='same',init='he_normal'))
    model.add(BatchNormalization())
    model.add(MaxPooling2D())
    
    model.add(Convolution2D(48,5,5,subsample=(2,2),activation='elu',border_mode='same',init='he_normal'))
    model.add(BatchNormalization())
    model.add(MaxPooling2D())
    
    model.add(Convolution2D(64,3,3,activation='elu',border_mode='same',init='he_normal'))
    model.add(BatchNormalization())
    
    model.add(Convolution2D(64,3,3,activation='elu',border_mode='same',init='he_normal'))
    model.add(BatchNormalization())
    
    model.add(Flatten())
    model.add(Dense(1164, activation='elu',init='he_normal'))
    model.add(Dropout(0.5))
    
    model.add(Dense(100, activation='elu',init='he_normal'))
    model.add(Dropout(0.3))
    
    model.add(Dense(50, activation='elu',init='he_normal'))
    model.add(Dropout(0.2))
    
    model.add(Dense(10, activation='elu',init='he_normal'))
    model.add(Dropout(0.1))
    
    model.add(Dense(1))

    return model
Using TensorFlow backend.
In [3]:
Nvidia_refined_model().summary()
Nvidia-------------
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
convolution2d_1 (Convolution2D)  (None, 40, 40, 24)    1824        convolution2d_input_1[0][0]      
____________________________________________________________________________________________________
batchnormalization_1 (BatchNorma (None, 40, 40, 24)    96          convolution2d_1[0][0]            
____________________________________________________________________________________________________
maxpooling2d_1 (MaxPooling2D)    (None, 20, 20, 24)    0           batchnormalization_1[0][0]       
____________________________________________________________________________________________________
convolution2d_2 (Convolution2D)  (None, 10, 10, 36)    21636       maxpooling2d_1[0][0]             
____________________________________________________________________________________________________
batchnormalization_2 (BatchNorma (None, 10, 10, 36)    144         convolution2d_2[0][0]            
____________________________________________________________________________________________________
maxpooling2d_2 (MaxPooling2D)    (None, 5, 5, 36)      0           batchnormalization_2[0][0]       
____________________________________________________________________________________________________
convolution2d_3 (Convolution2D)  (None, 3, 3, 48)      43248       maxpooling2d_2[0][0]             
____________________________________________________________________________________________________
batchnormalization_3 (BatchNorma (None, 3, 3, 48)      192         convolution2d_3[0][0]            
____________________________________________________________________________________________________
maxpooling2d_3 (MaxPooling2D)    (None, 1, 1, 48)      0           batchnormalization_3[0][0]       
____________________________________________________________________________________________________
convolution2d_4 (Convolution2D)  (None, 1, 1, 64)      27712       maxpooling2d_3[0][0]             
____________________________________________________________________________________________________
batchnormalization_4 (BatchNorma (None, 1, 1, 64)      256         convolution2d_4[0][0]            
____________________________________________________________________________________________________
convolution2d_5 (Convolution2D)  (None, 1, 1, 64)      36928       batchnormalization_4[0][0]       
____________________________________________________________________________________________________
batchnormalization_5 (BatchNorma (None, 1, 1, 64)      256         convolution2d_5[0][0]            
____________________________________________________________________________________________________
flatten_1 (Flatten)              (None, 64)            0           batchnormalization_5[0][0]       
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 1164)          75660       flatten_1[0][0]                  
____________________________________________________________________________________________________
dropout_1 (Dropout)              (None, 1164)          0           dense_1[0][0]                    
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 100)           116500      dropout_1[0][0]                  
____________________________________________________________________________________________________
dropout_2 (Dropout)              (None, 100)           0           dense_2[0][0]                    
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 50)            5050        dropout_2[0][0]                  
____________________________________________________________________________________________________
dropout_3 (Dropout)              (None, 50)            0           dense_3[0][0]                    
____________________________________________________________________________________________________
dense_4 (Dense)                  (None, 10)            510         dropout_3[0][0]                  
____________________________________________________________________________________________________
dropout_4 (Dropout)              (None, 10)            0           dense_4[0][0]                    
____________________________________________________________________________________________________
dense_5 (Dense)                  (None, 1)             11          dropout_4[0][0]                  
====================================================================================================
Total params: 330,023
Trainable params: 329,551
Non-trainable params: 472
____________________________________________________________________________________________________
In [4]:
### <3>Train the NVIDIA refined model


model = Nvidia_refined_model()
model.compile(loss='mse', optimizer='adam')
history_object = model.fit_generator(train_generator, samples_per_epoch= 
                                     len(train_samples)*3, validation_data=validation_generator, 
                                     nb_val_samples=len(validation_samples)*3,nb_epoch=10)
Nvidia-------------
Epoch 1/10
58320/58320 [==============================] - 41s - loss: 12.3596 - val_loss: 16.3098
Epoch 2/10
58320/58320 [==============================] - 35s - loss: 4.8543 - val_loss: 7.3601
Epoch 3/10
58320/58320 [==============================] - 35s - loss: 3.2042 - val_loss: 2.5542
Epoch 4/10
58320/58320 [==============================] - 36s - loss: 2.5189 - val_loss: 1.3969
Epoch 5/10
58320/58320 [==============================] - 38s - loss: 2.1358 - val_loss: 1.1753
Epoch 6/10
58320/58320 [==============================] - 37s - loss: 1.8533 - val_loss: 1.0263
Epoch 7/10
58320/58320 [==============================] - 36s - loss: 1.7034 - val_loss: 1.3407
Epoch 8/10
58320/58320 [==============================] - 35s - loss: 1.5898 - val_loss: 0.8573
Epoch 9/10
58320/58320 [==============================] - 35s - loss: 1.4742 - val_loss: 0.7142
Epoch 10/10
58320/58320 [==============================] - 37s - loss: 1.4000 - val_loss: 0.8484
In [5]:
### <4>Analyze the results
import preprocess_data
import matplotlib.pyplot as plt

# Load the test data
test_imgs, test_wheels = preprocess_data.load_data('test')
test_imgs = preprocess_data.nomorlize_image(test_imgs)

# Evaluate the model on the test set
test_loss = model.evaluate(test_imgs, test_wheels, batch_size=128)
print('\n Test loss is:{}'.format(test_loss))

# Plot the training and validation loss
plt.plot(history_object.history['loss'])
plt.plot(history_object.history['val_loss'])
plt.title('Nvidia model mean squared error loss')
plt.ylabel('mean squared error loss')
plt.xlabel('epoch')
plt.legend(['training set', 'validation set'], loc='upper right')
plt.show()
load data start!
The epoch10 mkv is processing
loading data filished!
2700/2700 [==============================] - 0s     

 Test loss is:4.479728813877812
In [6]:
### <5>Save the model
import os

# Paths for the model (.h5) and architecture (.json) files
model_saved_path = os.path.join('./models/', "nvidia_refined_model.h5")
json_saved_path = os.path.join('./models/', "nvidia_refined_model.json")

# Save the architecture as JSON
json_model = model.to_json()
with open(json_saved_path, "w") as json_file:
    json_file.write(json_model)

# Save the full model (architecture + weights)
model.save(model_saved_path)

(3) Refined Model + steering-angle data augmentation

In [1]:
### <1>Generate training data for the NVIDIA refined model with a generator
import csv
import preprocess_data
import random
from sklearn.model_selection import train_test_split

# Modified version of the csv-reading function 'read_csv' that oversamples
# rows with steering angle below -5 or above 5,
# growing the original 24,300 samples to 40,000
def read_csv_add():
    """
    # Reading the csv file and store in the list
    """
    samples_original = []
    samples_steering = []
    with open('./train_images/train_images.csv') as csvfile:
        reader = csv.reader(csvfile)
        for line in reader:
            samples_original.append(line)
    samples_original = samples_original[1:]
        
    # Collect samples with steering angle below -5 or above 5
    for sample in samples_original:
        if (float(sample[0]) < -5.0) or (float(sample[0]) > 5.0):
            samples_steering.append(sample)
    
    # Randomly duplicate large-steering samples until 40,000 rows are reached
    while True:
        if len(samples_original) < 40000:
            #print(len(samples_steering))
            index = random.choice(range(len(samples_steering)))
            samples_original.append(samples_steering[index])
        else:
            break
        
    return samples_original

# Load the training data
samples = read_csv_add()
print(len(samples))
40000
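The oversampling step above can be exercised on a toy list (hypothetical steering values and a scaled-down target size; the notebook's function uses the ±5 threshold and a 40,000-row target):

```python
import random

random.seed(0)

# Toy stand-in for the csv rows: [steering_angle, image_path].
samples_original = [['-6.2', 'a.jpg'], ['0.1', 'b.jpg'],
                    ['7.4', 'c.jpg'], ['1.3', 'd.jpg']]

# Same filter as read_csv_add: keep rows with |steering| > 5.
samples_steering = [s for s in samples_original if abs(float(s[0])) > 5.0]

# Randomly duplicate large-steering rows until the target size is reached.
target = 10
while len(samples_original) < target:
    samples_original.append(random.choice(samples_steering))

large = sum(abs(float(s[0])) > 5.0 for s in samples_original)
print(len(samples_original), large)  # → 10 8
```

Because only large-steering rows are duplicated, their share of the dataset grows from 2/4 to 8/10, which is exactly the rebalancing effect intended here.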
In [2]:
# Split into training and validation sets
train_samples, validation_samples = train_test_split(samples, test_size=0.2)
train_generator = preprocess_data.generator(train_samples, batch_size=128)
validation_generator = preprocess_data.generator(validation_samples, batch_size=128)
In [3]:
### <2>Build the NVIDIA refined model

from keras.models import Sequential
from keras.layers import Convolution2D, Dropout, Flatten, Dense, Lambda, BatchNormalization, MaxPooling2D

def Nvidia_refined_model():
    """
    Nvidia refined model
    """
    print("Nvidia-------------")
    model = Sequential()
    model.add(Convolution2D(24,5,5,subsample=(2,2),activation='elu',border_mode='same',init='he_normal',input_shape=(80,80,3)))
    model.add(BatchNormalization())
    model.add(MaxPooling2D())  
    
    model.add(Convolution2D(36,5,5,subsample=(2,2),activation='elu',border_mode='same',init='he_normal'))
    model.add(BatchNormalization())
    model.add(MaxPooling2D())
    
    model.add(Convolution2D(48,5,5,subsample=(2,2),activation='elu',border_mode='same',init='he_normal'))
    model.add(BatchNormalization())
    model.add(MaxPooling2D())
    
    model.add(Convolution2D(64,3,3,activation='elu',border_mode='same',init='he_normal'))
    model.add(BatchNormalization())
    
    model.add(Convolution2D(64,3,3,activation='elu',border_mode='same',init='he_normal'))
    model.add(BatchNormalization())
    
    model.add(Flatten())
    model.add(Dense(1164, activation='elu',init='he_normal'))
    model.add(Dropout(0.5))
    
    model.add(Dense(100, activation='elu',init='he_normal'))
    model.add(Dropout(0.3))
    
    model.add(Dense(50, activation='elu',init='he_normal'))
    model.add(Dropout(0.2))
    
    model.add(Dense(10, activation='elu',init='he_normal'))
    model.add(Dropout(0.1))
    
    model.add(Dense(1))

    return model
Using TensorFlow backend.
In [4]:
Nvidia_refined_model().summary()
Nvidia-------------
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
convolution2d_1 (Convolution2D)  (None, 40, 40, 24)    1824        convolution2d_input_1[0][0]      
____________________________________________________________________________________________________
batchnormalization_1 (BatchNorma (None, 40, 40, 24)    96          convolution2d_1[0][0]            
____________________________________________________________________________________________________
maxpooling2d_1 (MaxPooling2D)    (None, 20, 20, 24)    0           batchnormalization_1[0][0]       
____________________________________________________________________________________________________
convolution2d_2 (Convolution2D)  (None, 10, 10, 36)    21636       maxpooling2d_1[0][0]             
____________________________________________________________________________________________________
batchnormalization_2 (BatchNorma (None, 10, 10, 36)    144         convolution2d_2[0][0]            
____________________________________________________________________________________________________
maxpooling2d_2 (MaxPooling2D)    (None, 5, 5, 36)      0           batchnormalization_2[0][0]       
____________________________________________________________________________________________________
convolution2d_3 (Convolution2D)  (None, 3, 3, 48)      43248       maxpooling2d_2[0][0]             
____________________________________________________________________________________________________
batchnormalization_3 (BatchNorma (None, 3, 3, 48)      192         convolution2d_3[0][0]            
____________________________________________________________________________________________________
maxpooling2d_3 (MaxPooling2D)    (None, 1, 1, 48)      0           batchnormalization_3[0][0]       
____________________________________________________________________________________________________
convolution2d_4 (Convolution2D)  (None, 1, 1, 64)      27712       maxpooling2d_3[0][0]             
____________________________________________________________________________________________________
batchnormalization_4 (BatchNorma (None, 1, 1, 64)      256         convolution2d_4[0][0]            
____________________________________________________________________________________________________
convolution2d_5 (Convolution2D)  (None, 1, 1, 64)      36928       batchnormalization_4[0][0]       
____________________________________________________________________________________________________
batchnormalization_5 (BatchNorma (None, 1, 1, 64)      256         convolution2d_5[0][0]            
____________________________________________________________________________________________________
flatten_1 (Flatten)              (None, 64)            0           batchnormalization_5[0][0]       
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 1164)          75660       flatten_1[0][0]                  
____________________________________________________________________________________________________
dropout_1 (Dropout)              (None, 1164)          0           dense_1[0][0]                    
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 100)           116500      dropout_1[0][0]                  
____________________________________________________________________________________________________
dropout_2 (Dropout)              (None, 100)           0           dense_2[0][0]                    
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 50)            5050        dropout_2[0][0]                  
____________________________________________________________________________________________________
dropout_3 (Dropout)              (None, 50)            0           dense_3[0][0]                    
____________________________________________________________________________________________________
dense_4 (Dense)                  (None, 10)            510         dropout_3[0][0]                  
____________________________________________________________________________________________________
dropout_4 (Dropout)              (None, 10)            0           dense_4[0][0]                    
____________________________________________________________________________________________________
dense_5 (Dense)                  (None, 1)             11          dropout_4[0][0]                  
====================================================================================================
Total params: 330,023
Trainable params: 329,551
Non-trainable params: 472
____________________________________________________________________________________________________
In [5]:
### <3>Train the NVIDIA refined model


model = Nvidia_refined_model()
model.compile(loss='mse', optimizer='adam')
history_object = model.fit_generator(train_generator, samples_per_epoch= 
                                     len(train_samples)*3, validation_data=validation_generator, 
                                     nb_val_samples=len(validation_samples)*3,nb_epoch=10)
Nvidia-------------
Epoch 1/10
96000/96000 [==============================] - 265s - loss: 12.0565 - val_loss: 24.0982
Epoch 2/10
96000/96000 [==============================] - 61s - loss: 4.1674 - val_loss: 3.9496
Epoch 3/10
96000/96000 [==============================] - 61s - loss: 3.1119 - val_loss: 1.3425
Epoch 4/10
96000/96000 [==============================] - 60s - loss: 2.6613 - val_loss: 1.1643
Epoch 5/10
96000/96000 [==============================] - 62s - loss: 2.3794 - val_loss: 0.8630
Epoch 6/10
96000/96000 [==============================] - 61s - loss: 2.1848 - val_loss: 0.7885
Epoch 7/10
96000/96000 [==============================] - 62s - loss: 2.0637 - val_loss: 0.6422
Epoch 8/10
96000/96000 [==============================] - 62s - loss: 1.9930 - val_loss: 0.8113
Epoch 9/10
96000/96000 [==============================] - 63s - loss: 1.8898 - val_loss: 0.5942
Epoch 10/10
96000/96000 [==============================] - 63s - loss: 1.8166 - val_loss: 0.5717
In [6]:
### <4>Analyze the results
import preprocess_data
import matplotlib.pyplot as plt

# Load the test data
test_imgs, test_wheels = preprocess_data.load_data('test')
test_imgs = preprocess_data.nomorlize_image(test_imgs)

# Evaluate the model on the test set
test_loss = model.evaluate(test_imgs, test_wheels, batch_size=128)
print('\n Test loss is:{}'.format(test_loss))

# Plot the training and validation loss
plt.plot(history_object.history['loss'])
plt.plot(history_object.history['val_loss'])
plt.title('Nvidia model mean squared error loss')
plt.ylabel('mean squared error loss')
plt.xlabel('epoch')
plt.legend(['training set', 'validation set'], loc='upper right')
plt.show()
load data start!
The epoch10 mkv is processing
loading data filished!
2700/2700 [==============================] - 0s     

 Test loss is:3.2823049792536985
In [7]:
### <5>Save the model
import os

# Paths for the model (.h5) and architecture (.json) files
model_saved_path = os.path.join('./models/', "nvidia_ra_model.h5")
json_saved_path = os.path.join('./models/', "nvidia_ra_model.json")

# Save the architecture as JSON
json_model = model.to_json()
with open(json_saved_path, "w") as json_file:
    json_file.write(json_model)

# Save the full model (architecture + weights)
model.save(model_saved_path)

Analysis: after oversampling the large-steering samples, each training epoch covers 96,000 frames, and the model improves on both the validation set and the test set (test loss drops from 4.48 to 3.28). The augmentation clearly helps; the time-series comparison of predicted versus human steering is still needed for a fuller assessment.
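The comparison here boils down to a frame-wise mean squared error between the two steering signals, the same metric `model.evaluate` reports with `loss='mse'`. A minimal sketch with hypothetical values (in the notebook, the ground truth would come from the csv and the predictions from `model.predict`):

```python
# Hypothetical steering signals: human ground truth vs. model predictions.
test_wheels = [-1.0, -0.5, 0.0, 2.0, 4.0]
pred_wheels = [-0.8, -0.6, 0.3, 1.5, 3.5]

# Frame-wise mean squared error, the metric behind loss='mse'.
mse = sum((p - t) ** 2 for p, t in zip(pred_wheels, test_wheels)) / len(test_wheels)
print(round(mse, 3))  # → 0.128
```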


2.VGG16 + Nvidia model

(1) notop + no blocks

In [1]:
### <1>Generate training data for the VGG16 + Nvidia model with a generator

import preprocess_data
from sklearn.model_selection import train_test_split

# Load the training data
samples = preprocess_data.read_csv('./train_images/train_images.csv')
train_samples, validation_samples = train_test_split(samples, test_size=0.2)
train_generator = preprocess_data.generator(train_samples, batch_size=64)
validation_generator = preprocess_data.generator(validation_samples, batch_size=64)
In [2]:
### <2>VGG16 + Nvidia model

from keras.models import Sequential
from keras.layers import Convolution2D, Dropout, Flatten, Dense, Lambda, Input, GlobalAveragePooling2D
from keras.models import Model
from keras.applications.vgg16 import VGG16

def Vgg16_first_model():
    """
    Vgg16 model
    """
    print("Vgg16---------------")
    input_shape = (80,80,3)
    input_tensor = Input(shape=input_shape)

    base_model = VGG16(input_tensor=input_tensor, weights='imagenet', include_top=False)
    x = base_model.output
    x = GlobalAveragePooling2D()(x)
    x = Dense(512, activation='elu')(x)
    x = Dropout(0.5)(x)
    x = Dense(256, activation='elu')(x)
    x = Dropout(0.3)(x)
    x = Dense(64, activation='elu')(x)
    x = Dropout(0.1)(x)
    predictions = Dense(1, init='zero')(x)

    # the model we will train
    model = Model(input=base_model.input, output=predictions)

    return model
Using TensorFlow backend.
In [3]:
Vgg16_first_model().summary()
Vgg16---------------
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
input_1 (InputLayer)             (None, 80, 80, 3)     0                                            
____________________________________________________________________________________________________
block1_conv1 (Convolution2D)     (None, 80, 80, 64)    1792        input_1[0][0]                    
____________________________________________________________________________________________________
block1_conv2 (Convolution2D)     (None, 80, 80, 64)    36928       block1_conv1[0][0]               
____________________________________________________________________________________________________
block1_pool (MaxPooling2D)       (None, 40, 40, 64)    0           block1_conv2[0][0]               
____________________________________________________________________________________________________
block2_conv1 (Convolution2D)     (None, 40, 40, 128)   73856       block1_pool[0][0]                
____________________________________________________________________________________________________
block2_conv2 (Convolution2D)     (None, 40, 40, 128)   147584      block2_conv1[0][0]               
____________________________________________________________________________________________________
block2_pool (MaxPooling2D)       (None, 20, 20, 128)   0           block2_conv2[0][0]               
____________________________________________________________________________________________________
block3_conv1 (Convolution2D)     (None, 20, 20, 256)   295168      block2_pool[0][0]                
____________________________________________________________________________________________________
block3_conv2 (Convolution2D)     (None, 20, 20, 256)   590080      block3_conv1[0][0]               
____________________________________________________________________________________________________
block3_conv3 (Convolution2D)     (None, 20, 20, 256)   590080      block3_conv2[0][0]               
____________________________________________________________________________________________________
block3_pool (MaxPooling2D)       (None, 10, 10, 256)   0           block3_conv3[0][0]               
____________________________________________________________________________________________________
block4_conv1 (Convolution2D)     (None, 10, 10, 512)   1180160     block3_pool[0][0]                
____________________________________________________________________________________________________
block4_conv2 (Convolution2D)     (None, 10, 10, 512)   2359808     block4_conv1[0][0]               
____________________________________________________________________________________________________
block4_conv3 (Convolution2D)     (None, 10, 10, 512)   2359808     block4_conv2[0][0]               
____________________________________________________________________________________________________
block4_pool (MaxPooling2D)       (None, 5, 5, 512)     0           block4_conv3[0][0]               
____________________________________________________________________________________________________
block5_conv1 (Convolution2D)     (None, 5, 5, 512)     2359808     block4_pool[0][0]                
____________________________________________________________________________________________________
block5_conv2 (Convolution2D)     (None, 5, 5, 512)     2359808     block5_conv1[0][0]               
____________________________________________________________________________________________________
block5_conv3 (Convolution2D)     (None, 5, 5, 512)     2359808     block5_conv2[0][0]               
____________________________________________________________________________________________________
block5_pool (MaxPooling2D)       (None, 2, 2, 512)     0           block5_conv3[0][0]               
____________________________________________________________________________________________________
globalaveragepooling2d_1 (Global (None, 512)           0           block5_pool[0][0]                
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 512)           262656      globalaveragepooling2d_1[0][0]   
____________________________________________________________________________________________________
dropout_1 (Dropout)              (None, 512)           0           dense_1[0][0]                    
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 256)           131328      dropout_1[0][0]                  
____________________________________________________________________________________________________
dropout_2 (Dropout)              (None, 256)           0           dense_2[0][0]                    
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 64)            16448       dropout_2[0][0]                  
____________________________________________________________________________________________________
dropout_3 (Dropout)              (None, 64)            0           dense_3[0][0]                    
____________________________________________________________________________________________________
dense_4 (Dense)                  (None, 1)             65          dropout_3[0][0]                  
====================================================================================================
Total params: 15,125,185
Trainable params: 15,125,185
Non-trainable params: 0
____________________________________________________________________________________________________
In [6]:
### <3>Train the VGG16 + Nvidia model
from keras.optimizers import Adam


model = Vgg16_first_model()

# Freeze the entire VGG16 base (all 19 layers); train only the new top layers
for layer in model.layers[:19]:
    layer.trainable = False
for layer in model.layers[19:]:
    layer.trainable = True  

# Adam with the paper's default betas and epsilon, plus an added learning-rate decay (decay=0.6 is not a paper default)
opt = Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08, decay=0.6)
model.compile(optimizer=opt, loss='mse')
history_object = model.fit_generator(train_generator, samples_per_epoch= 
                                     len(train_samples)*3, validation_data=validation_generator, 
                                     nb_val_samples=len(validation_samples)*3,nb_epoch=5)
Vgg16---------------
Epoch 1/5
58320/58320 [==============================] - 109s - loss: 21.0579 - val_loss: 21.4298
Epoch 2/5
58320/58320 [==============================] - 108s - loss: 21.0543 - val_loss: 21.4272
Epoch 3/5
58320/58320 [==============================] - 108s - loss: 21.0524 - val_loss: 21.4257
Epoch 4/5
58320/58320 [==============================] - 108s - loss: 21.0525 - val_loss: 21.4244
Epoch 5/5
58320/58320 [==============================] - 108s - loss: 21.0515 - val_loss: 21.4235
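One additional factor worth noting: `decay=0.6` is aggressive. Assuming the Keras-1-style legacy behavior where the optimizer scales the learning rate as lr / (1 + decay · iterations), and roughly 911 updates per epoch (58320 frames at batch size 64), the effective step size collapses within the first epoch:

```python
# Keras-1-style iteration decay: lr_t = lr / (1 + decay * t)
lr, decay = 0.001, 0.6

def effective_lr(t):
    return lr / (1.0 + decay * t)

# 58320 frames / batch size 64 ≈ 911 updates in one epoch.
print(effective_lr(0))                                # → 0.001
print(round(effective_lr(911) / effective_lr(0), 6))  # → 0.001826
```

So after one epoch the learning rate has shrunk by a factor of roughly 500, which further limits how much the randomly initialized head can learn.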

Analysis: with all convolutional blocks frozen, the pretrained ImageNet weights alone do not extract features suited to this task, and the loss stays essentially flat across epochs. Some of the convolutional layers must be unfrozen and trained.
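To choose the freeze boundary by layer index, one can enumerate `model.layers` and locate the first layer of the block to unfreeze. A stand-in sketch using the layer names from the summary above (a plain list, so no Keras install is assumed):

```python
# VGG16 (notop) layer names in model.layers order, per the summary above.
layer_names = [
    'input_1',
    'block1_conv1', 'block1_conv2', 'block1_pool',
    'block2_conv1', 'block2_conv2', 'block2_pool',
    'block3_conv1', 'block3_conv2', 'block3_conv3', 'block3_pool',
    'block4_conv1', 'block4_conv2', 'block4_conv3', 'block4_pool',
    'block5_conv1', 'block5_conv2', 'block5_conv3', 'block5_pool',
]

# Index of the first layer to train when unfreezing the top 2 blocks:
freeze_until = layer_names.index('block4_conv1')
print(freeze_until)  # → 11
```

This is the origin of the `model.layers[:11]` / `model.layers[11:]` split used when training the top two blocks.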

(2) notop + top 2 blocks

In [2]:
### <1>Generate training data for the VGG16 + Nvidia model with a generator

import preprocess_data
from sklearn.model_selection import train_test_split

# Load the training data
samples = preprocess_data.read_csv('./train_images/train_images.csv')
train_samples, validation_samples = train_test_split(samples, test_size=0.2)
train_generator = preprocess_data.generator(train_samples, batch_size=64)
validation_generator = preprocess_data.generator(validation_samples, batch_size=64)
In [3]:
### <2>VGG16 + Nvidia model

from keras.models import Sequential
from keras.layers import Convolution2D, Dropout, Flatten, Dense, Lambda, Input, GlobalAveragePooling2D
from keras.models import Model
from keras.applications.vgg16 import VGG16

def Vgg16_transfer_model():
    """
    Vgg16 model
    """
    print("Vgg16---------------")
    input_shape = (80,80,3)
    input_tensor = Input(shape=input_shape)

    base_model = VGG16(input_tensor=input_tensor, weights='imagenet', include_top=False)
    x = base_model.output
    x = GlobalAveragePooling2D()(x)
    x = Dense(512, activation='elu')(x)
    x = Dropout(0.5)(x)
    x = Dense(256, activation='elu')(x)
    x = Dropout(0.3)(x)
    x = Dense(64, activation='elu')(x)
    x = Dropout(0.1)(x)
    predictions = Dense(1, init='zero')(x)

    # the model we will train
    model = Model(input=base_model.input, output=predictions)

    return model
Using TensorFlow backend.
In [4]:
Vgg16_transfer_model().summary()
Vgg16---------------
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
input_1 (InputLayer)             (None, 80, 80, 3)     0                                            
____________________________________________________________________________________________________
block1_conv1 (Convolution2D)     (None, 80, 80, 64)    1792        input_1[0][0]                    
____________________________________________________________________________________________________
block1_conv2 (Convolution2D)     (None, 80, 80, 64)    36928       block1_conv1[0][0]               
____________________________________________________________________________________________________
block1_pool (MaxPooling2D)       (None, 40, 40, 64)    0           block1_conv2[0][0]               
____________________________________________________________________________________________________
block2_conv1 (Convolution2D)     (None, 40, 40, 128)   73856       block1_pool[0][0]                
____________________________________________________________________________________________________
block2_conv2 (Convolution2D)     (None, 40, 40, 128)   147584      block2_conv1[0][0]               
____________________________________________________________________________________________________
block2_pool (MaxPooling2D)       (None, 20, 20, 128)   0           block2_conv2[0][0]               
____________________________________________________________________________________________________
block3_conv1 (Convolution2D)     (None, 20, 20, 256)   295168      block2_pool[0][0]                
____________________________________________________________________________________________________
block3_conv2 (Convolution2D)     (None, 20, 20, 256)   590080      block3_conv1[0][0]               
____________________________________________________________________________________________________
block3_conv3 (Convolution2D)     (None, 20, 20, 256)   590080      block3_conv2[0][0]               
____________________________________________________________________________________________________
block3_pool (MaxPooling2D)       (None, 10, 10, 256)   0           block3_conv3[0][0]               
____________________________________________________________________________________________________
block4_conv1 (Convolution2D)     (None, 10, 10, 512)   1180160     block3_pool[0][0]                
____________________________________________________________________________________________________
block4_conv2 (Convolution2D)     (None, 10, 10, 512)   2359808     block4_conv1[0][0]               
____________________________________________________________________________________________________
block4_conv3 (Convolution2D)     (None, 10, 10, 512)   2359808     block4_conv2[0][0]               
____________________________________________________________________________________________________
block4_pool (MaxPooling2D)       (None, 5, 5, 512)     0           block4_conv3[0][0]               
____________________________________________________________________________________________________
block5_conv1 (Convolution2D)     (None, 5, 5, 512)     2359808     block4_pool[0][0]                
____________________________________________________________________________________________________
block5_conv2 (Convolution2D)     (None, 5, 5, 512)     2359808     block5_conv1[0][0]               
____________________________________________________________________________________________________
block5_conv3 (Convolution2D)     (None, 5, 5, 512)     2359808     block5_conv2[0][0]               
____________________________________________________________________________________________________
block5_pool (MaxPooling2D)       (None, 2, 2, 512)     0           block5_conv3[0][0]               
____________________________________________________________________________________________________
globalaveragepooling2d_1 (Global (None, 512)           0           block5_pool[0][0]                
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 512)           262656      globalaveragepooling2d_1[0][0]   
____________________________________________________________________________________________________
dropout_1 (Dropout)              (None, 512)           0           dense_1[0][0]                    
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 256)           131328      dropout_1[0][0]                  
____________________________________________________________________________________________________
dropout_2 (Dropout)              (None, 256)           0           dense_2[0][0]                    
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 64)            16448       dropout_2[0][0]                  
____________________________________________________________________________________________________
dropout_3 (Dropout)              (None, 64)            0           dense_3[0][0]                    
____________________________________________________________________________________________________
dense_4 (Dense)                  (None, 1)             65          dropout_3[0][0]                  
====================================================================================================
Total params: 15,125,185
Trainable params: 15,125,185
Non-trainable params: 0
____________________________________________________________________________________________________
In [4]:
### <3>Train the VGG16 + Nvidia model
from keras.optimizers import Adam


model = Vgg16_transfer_model()

# Unfreeze the top 2 blocks (block4 and block5) along with the new top layers
for layer in model.layers[:11]:
    layer.trainable = False
for layer in model.layers[11:]:
    layer.trainable = True  

# Adam with the paper's default betas and epsilon, plus an added learning-rate decay (decay=0.6 is not a paper default)
opt = Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08, decay=0.6)
model.compile(optimizer=opt, loss='mse')
history_object = model.fit_generator(train_generator, samples_per_epoch= 
                                     len(train_samples)*3, validation_data=validation_generator, 
                                     nb_val_samples=len(validation_samples)*3,nb_epoch=10)
Vgg16---------------
Epoch 1/10
58320/58320 [==============================] - 172s - loss: 14.5413 - val_loss: 10.7658
Epoch 2/10
58320/58320 [==============================] - 165s - loss: 10.0365 - val_loss: 8.2215
Epoch 3/10
58320/58320 [==============================] - 165s - loss: 8.2899 - val_loss: 7.0026
Epoch 4/10
58320/58320 [==============================] - 165s - loss: 7.2961 - val_loss: 6.2059
Epoch 5/10
58320/58320 [==============================] - 166s - loss: 6.6702 - val_loss: 5.7412
Epoch 6/10
58320/58320 [==============================] - 166s - loss: 6.2682 - val_loss: 5.3753
Epoch 7/10
58320/58320 [==============================] - 166s - loss: 5.9600 - val_loss: 5.1232
Epoch 8/10
58320/58320 [==============================] - 165s - loss: 5.6579 - val_loss: 4.8521
Epoch 9/10
58320/58320 [==============================] - 165s - loss: 5.4689 - val_loss: 4.6580
Epoch 10/10
58320/58320 [==============================] - 165s - loss: 5.3162 - val_loss: 4.5624
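Note that `decay=0.6` is unusually aggressive. Assuming Keras 1's convention `lr_t = lr / (1 + decay * t)`, where `t` counts gradient updates (batches, not epochs), the effective step size collapses within the first epoch; a minimal standalone sketch:

```python
# Sketch of the Keras-1 style learning-rate decay lr_t = lr / (1 + decay * t)
# (an assumption about the framework's bookkeeping; t counts batch updates)
def effective_lr(lr, decay, t):
    return lr / (1.0 + decay * t)

base_lr, decay = 0.001, 0.6
# 58320 samples / batch_size 128 is roughly 456 updates in the first epoch
schedule = [effective_lr(base_lr, decay, t) for t in (0, 1, 10, 456)]
```

After a single epoch the step size is already below 4e-6, which may explain the smooth, slowly flattening loss curve above.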
In [5]:
### <4> Analyze the results
import preprocess_data
import matplotlib.pyplot as plt

# Read the test data
test_imgs, test_wheels = preprocess_data.load_data('test')
test_imgs = preprocess_data.nomorlize_image(test_imgs)

# Evaluate the model on the test set
test_loss = model.evaluate(test_imgs, test_wheels, batch_size=128)
print('\n Test loss is:{}'.format(test_loss))

# Plot training and validation loss
plt.plot(history_object.history['loss'])
plt.plot(history_object.history['val_loss'])
plt.title('VGG16 + Nvidia mean squared error loss')
plt.ylabel('mean squared error loss')
plt.xlabel('epoch')
plt.legend(['training set', 'validation set'], loc='upper right')
plt.show()
load data start!
The epoch10 mkv is processing
loading data filished!
2700/2700 [==============================] - 6s     

 Test loss is:2.782660860132288
In [6]:
### <5> Save the model
import os

# Storage paths for the weights file and the architecture JSON
model_saved_path = os.path.join('./models/', "vgg16_model.h5")
json_saved_path = os.path.join('./models/', "vgg16_model.json")

# Save the architecture as JSON
json_model = model.to_json()
with open(json_saved_path, "w") as json_file:
    json_file.write(json_model)
    
# Save the full model
model.save(model_saved_path)

(3) notop + top 2 blocks + oversampling large steering angles

In [1]:
### <1> Generate VGG16 + Nvidia training data with a generator
import csv
import preprocess_data
import random
from sklearn.model_selection import train_test_split

# Modify the csv-reading function 'read_csv' to oversample frames whose steering angle is below -5 or above 5
# The original 24,300 samples are expanded to 40,000
def read_csv_add():
    """
    Read the csv file and oversample large-steering-angle frames.
    """
    samples_original = []
    samples_steering = []
    with open('./train_images/train_images.csv') as csvfile:
        reader = csv.reader(csvfile)
        for line in reader:
            samples_original.append(line)
    samples_original = samples_original[1:]  # drop the header row

    # Collect samples with steering angle < -5 or > 5
    for sample in samples_original:
        if (float(sample[0]) < -5.0) or (float(sample[0]) > 5.0):
            samples_steering.append(sample)

    # Duplicate large-steering samples until the set reaches 40,000
    while len(samples_original) < 40000:
        samples_original.append(random.choice(samples_steering))

    return samples_original

# Load the training data
samples = read_csv_add()
print(len(samples))
40000
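The oversampling step of `read_csv_add` can be illustrated on toy data (hypothetical angles and image paths; the real code reads them from `train_images.csv`):

```python
import random

random.seed(0)  # make the sketch reproducible

# Toy samples: (steering_angle, image_path); angles beyond +/-5 are rare
samples = [(0.5, 'a.jpg'), (1.2, 'b.jpg'), (-7.0, 'c.jpg'), (6.3, 'd.jpg')]
large = [s for s in samples if s[0] < -5.0 or s[0] > 5.0]

TARGET = 10  # the notebook uses 40000
# Duplicate randomly chosen large-steering samples until the target size
while len(samples) < TARGET:
    samples.append(random.choice(large))
```

Every appended sample is drawn from the large-steering subset, so the angle distribution shifts toward sharp turns without touching the original frames.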
In [2]:
# Split into training and validation sets
train_samples, validation_samples = train_test_split(samples, test_size=0.2)
train_generator = preprocess_data.generator(train_samples, batch_size=128)
validation_generator = preprocess_data.generator(validation_samples, batch_size=128)
In [3]:
### <2>VGG16 + Nvidia model

from keras.models import Sequential
from keras.layers import Convolution2D, Dropout, Flatten, Dense, Lambda, Input, GlobalAveragePooling2D
from keras.models import Model
from keras.applications.vgg16 import VGG16

def Vgg16_transfer_model():
    """
    Vgg16 model
    """
    print("Vgg16---------------")
    input_shape = (80,80,3)
    input_tensor = Input(shape=input_shape)

    base_model = VGG16(input_tensor=input_tensor, weights='imagenet', include_top=False)
    x = base_model.output
    x = GlobalAveragePooling2D()(x)
    x = Dense(512, activation='elu')(x)
    x = Dropout(0.5)(x)
    x = Dense(256, activation='elu')(x)
    x = Dropout(0.3)(x)
    x = Dense(64, activation='elu')(x)
    x = Dropout(0.1)(x)
    predictions = Dense(1, init='zero')(x)

    # the model we will train
    model = Model(input=base_model.input, output=predictions)

    return model
Using TensorFlow backend.
In [4]:
Vgg16_transfer_model().summary()
Vgg16---------------
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
input_1 (InputLayer)             (None, 80, 80, 3)     0                                            
____________________________________________________________________________________________________
block1_conv1 (Convolution2D)     (None, 80, 80, 64)    1792        input_1[0][0]                    
____________________________________________________________________________________________________
block1_conv2 (Convolution2D)     (None, 80, 80, 64)    36928       block1_conv1[0][0]               
____________________________________________________________________________________________________
block1_pool (MaxPooling2D)       (None, 40, 40, 64)    0           block1_conv2[0][0]               
____________________________________________________________________________________________________
block2_conv1 (Convolution2D)     (None, 40, 40, 128)   73856       block1_pool[0][0]                
____________________________________________________________________________________________________
block2_conv2 (Convolution2D)     (None, 40, 40, 128)   147584      block2_conv1[0][0]               
____________________________________________________________________________________________________
block2_pool (MaxPooling2D)       (None, 20, 20, 128)   0           block2_conv2[0][0]               
____________________________________________________________________________________________________
block3_conv1 (Convolution2D)     (None, 20, 20, 256)   295168      block2_pool[0][0]                
____________________________________________________________________________________________________
block3_conv2 (Convolution2D)     (None, 20, 20, 256)   590080      block3_conv1[0][0]               
____________________________________________________________________________________________________
block3_conv3 (Convolution2D)     (None, 20, 20, 256)   590080      block3_conv2[0][0]               
____________________________________________________________________________________________________
block3_pool (MaxPooling2D)       (None, 10, 10, 256)   0           block3_conv3[0][0]               
____________________________________________________________________________________________________
block4_conv1 (Convolution2D)     (None, 10, 10, 512)   1180160     block3_pool[0][0]                
____________________________________________________________________________________________________
block4_conv2 (Convolution2D)     (None, 10, 10, 512)   2359808     block4_conv1[0][0]               
____________________________________________________________________________________________________
block4_conv3 (Convolution2D)     (None, 10, 10, 512)   2359808     block4_conv2[0][0]               
____________________________________________________________________________________________________
block4_pool (MaxPooling2D)       (None, 5, 5, 512)     0           block4_conv3[0][0]               
____________________________________________________________________________________________________
block5_conv1 (Convolution2D)     (None, 5, 5, 512)     2359808     block4_pool[0][0]                
____________________________________________________________________________________________________
block5_conv2 (Convolution2D)     (None, 5, 5, 512)     2359808     block5_conv1[0][0]               
____________________________________________________________________________________________________
block5_conv3 (Convolution2D)     (None, 5, 5, 512)     2359808     block5_conv2[0][0]               
____________________________________________________________________________________________________
block5_pool (MaxPooling2D)       (None, 2, 2, 512)     0           block5_conv3[0][0]               
____________________________________________________________________________________________________
globalaveragepooling2d_1 (Global (None, 512)           0           block5_pool[0][0]                
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 512)           262656      globalaveragepooling2d_1[0][0]   
____________________________________________________________________________________________________
dropout_1 (Dropout)              (None, 512)           0           dense_1[0][0]                    
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 256)           131328      dropout_1[0][0]                  
____________________________________________________________________________________________________
dropout_2 (Dropout)              (None, 256)           0           dense_2[0][0]                    
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 64)            16448       dropout_2[0][0]                  
____________________________________________________________________________________________________
dropout_3 (Dropout)              (None, 64)            0           dense_3[0][0]                    
____________________________________________________________________________________________________
dense_4 (Dense)                  (None, 1)             65          dropout_3[0][0]                  
====================================================================================================
Total params: 15,125,185
Trainable params: 15,125,185
Non-trainable params: 0
____________________________________________________________________________________________________
In [5]:
### <3> Train the VGG16 + Nvidia model
from keras.optimizers import Adam


model = Vgg16_transfer_model()

# Train only the top 2 convolutional blocks; freeze the earlier layers
for layer in model.layers[:11]:
    layer.trainable = False
for layer in model.layers[11:]:
    layer.trainable = True  

# Adam defaults from the original paper, except for an aggressive learning-rate decay
opt = Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-08, decay=0.6)
model.compile(optimizer=opt, loss='mse')
history_object = model.fit_generator(train_generator, samples_per_epoch= 
                                     len(train_samples)*3, validation_data=validation_generator, 
                                     nb_val_samples=len(validation_samples)*3,nb_epoch=10)
Vgg16---------------
Epoch 1/10
96000/96000 [==============================] - 262s - loss: 19.4276 - val_loss: 8.1972
Epoch 2/10
96000/96000 [==============================] - 257s - loss: 8.1509 - val_loss: 5.4931
Epoch 3/10
96000/96000 [==============================] - 257s - loss: 6.5458 - val_loss: 4.6628
Epoch 4/10
96000/96000 [==============================] - 257s - loss: 5.7366 - val_loss: 4.1017
Epoch 5/10
96000/96000 [==============================] - 258s - loss: 5.3208 - val_loss: 3.7532
Epoch 6/10
96000/96000 [==============================] - 258s - loss: 4.9637 - val_loss: 3.5392
Epoch 7/10
96000/96000 [==============================] - 257s - loss: 4.7431 - val_loss: 3.3827
Epoch 8/10
96000/96000 [==============================] - 258s - loss: 4.5388 - val_loss: 3.2475
Epoch 9/10
96000/96000 [==============================] - 258s - loss: 4.3775 - val_loss: 3.1380
Epoch 10/10
96000/96000 [==============================] - 258s - loss: 4.2552 - val_loss: 2.9988
In [6]:
### <4> Analyze the results
import preprocess_data
import matplotlib.pyplot as plt

# Read the test data
test_imgs, test_wheels = preprocess_data.load_data('test')
test_imgs = preprocess_data.nomorlize_image(test_imgs)

# Evaluate the model on the test set
test_loss = model.evaluate(test_imgs, test_wheels, batch_size=128)
print('\n Test loss is:{}'.format(test_loss))

# Plot training and validation loss
plt.plot(history_object.history['loss'])
plt.plot(history_object.history['val_loss'])
plt.title('VGG16 + Nvidia mean squared error loss')
plt.ylabel('mean squared error loss')
plt.xlabel('epoch')
plt.legend(['training set', 'validation set'], loc='upper right')
plt.show()
load data start!
The epoch10 mkv is processing
loading data filished!
2700/2700 [==============================] - 5s     

 Test loss is:3.54451801971153
In [7]:
### <5> Save the model
import os

# Storage paths for the weights file and the architecture JSON
model_saved_path = os.path.join('./models/', "vgg16_add_model.h5")
json_saved_path = os.path.join('./models/', "vgg16_add_model.json")

# Save the architecture as JSON
json_model = model.to_json()
with open(json_saved_path, "w") as json_file:
    json_file.write(json_model)
    
# Save the full model
model.save(model_saved_path)

Analysis: again, after oversampling large-steering frames the per-epoch training data grows to 96,000 samples. The model improves somewhat on the validation set, but the test loss changes little. This once more suggests that data augmentation alone brings limited gains; we still need the time-series plots to compare the predicted steering against the human steering.
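Before building the full time-series plots in Part 4, the per-frame comparison can be sketched with hypothetical traces (in the notebook the real arrays would come from `test_wheels` and `model.predict` on the test frames):

```python
import numpy as np

# Hypothetical steering traces in degrees (placeholders, not project data)
human = np.array([0.0, -1.0, -2.5, -4.0, -2.0, 0.5])
model = np.array([0.2, -0.8, -3.0, -3.1, -2.4, 0.1])

# Per-frame error and its mean square, the same quantity as the test loss
err = model - human
mse = float(np.mean(err ** 2))

# A 3-frame moving average smooths jitter before plotting the two traces
kernel = np.ones(3) / 3.0
smooth_model = np.convolve(model, kernel, mode='valid')
```

Plotting `human` and `smooth_model` against the frame index gives exactly the kind of time-series comparison the analysis above calls for.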


3. CNN+RNN seq2seq model (optional)

Notes

In [1]:
### <1> Import the required libraries

import tensorflow as tf
import numpy as np
import pandas as pd
import os
import csv
slim = tf.contrib.slim
In [2]:
### <2> Define hyperparameters

SEQ_LEN = 10
BATCH_SIZE = 4
LEFT_CONTEXT = 5

# Input image dimensions
HEIGHT = 80
WIDTH = 80
CHANNELS = 3 # RGB

# Sizes of the LSTM state kept by the model
RNN_SIZE = 32
RNN_PROJ = 32

# The quantity to predict is the steering angle
CSV_HEADER = "timestamp,index,angle".split(",")
OUTPUTS = [CSV_HEADER[2]]
OUTPUT_DIM = len(OUTPUTS)
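From these constants, the tensor shapes that the batch generator must emit can be checked with a quick numpy sketch:

```python
import numpy as np

# Shapes derived from the hyperparameters above
SEQ_LEN, BATCH_SIZE, LEFT_CONTEXT = 10, 4, 5
HEIGHT, WIDTH, CHANNELS, OUTPUT_DIM = 80, 80, 3, 1

# Each batch row is a sequence with LEFT_CONTEXT warm-up frames prepended
images = np.zeros((BATCH_SIZE, LEFT_CONTEXT + SEQ_LEN, HEIGHT, WIDTH, CHANNELS))
# Only the last SEQ_LEN steps of each sequence carry prediction targets
targets = np.zeros((BATCH_SIZE, SEQ_LEN, OUTPUT_DIM))
```

So one batch carries 4 x 15 = 60 frames, of which 4 x 10 = 40 have steering targets.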
In [3]:
### <3> Define the input/output format
### The csv files are read into BATCH_SIZE x SEQ_LEN sample matrices;
### 'BatchGenerator' then prepends LEFT_CONTEXT frames to each sequence,
### yielding batches shaped [BATCH_SIZE, LEFT_CONTEXT + SEQ_LEN, HEIGHT, WIDTH, CHANNELS]

import cv2
class BatchGenerator(object):
    def __init__(self, sequence, seq_len, batch_size):
        self.sequence = sequence
        self.seq_len = seq_len
        self.batch_size = batch_size
        chunk_size = 1 + (len(sequence) - 1) // batch_size
        self.indices = [(i*chunk_size) % len(sequence) for i in range(batch_size)]
        
    def next(self):
        output = []
        for i in range(self.batch_size):
            idx = self.indices[i]
            # Pad the left context with copies of the first frame if needed
            left_pad = self.sequence[idx - LEFT_CONTEXT:idx]
            if len(left_pad) < LEFT_CONTEXT:
                left_pad = [self.sequence[0]] * (LEFT_CONTEXT - len(left_pad)) + left_pad
            assert len(left_pad) == LEFT_CONTEXT
            leftover = len(self.sequence) - idx
            if leftover >= self.seq_len:
                result = self.sequence[idx:idx + self.seq_len]
            else:
                result = self.sequence[idx:] + self.sequence[:self.seq_len - leftover]
            assert len(result) == self.seq_len
            self.indices[i] = (idx + self.seq_len) % len(self.sequence)
            images, targets = zip(*result)
            images_left_pad, _ = zip(*left_pad)
            output.append((np.stack(images_left_pad + images), np.stack(targets)))
        output_zip = tuple(zip(*output))
        output_0 = np.hstack(list(output_zip)[0])  # batch_size x (LEFT_CONTEXT + seq_len)
        output_1 = np.hstack(list(output_zip)[1])  # batch_size x seq_len x OUTPUT_DIM
        output = []
        output.append(np.reshape(output_0, (self.batch_size, LEFT_CONTEXT + self.seq_len)))
        output.append(np.reshape(output_1, (self.batch_size, self.seq_len, OUTPUT_DIM)))
        return output
        
def read_csv(filename):
    with open(filename, 'r') as f:
        lines = [ln.strip().split(",")[0:2] for ln in f.readlines()]
        lines = lines[1:]
        lines = map(lambda x: (x[1], np.float32(x[0])), lines) # imagefile, outputs
        return lines

def process_csv(filename, val=20):
    sum_f = np.float64([0.0] * OUTPUT_DIM)
    sum_sq_f = np.float64([0.0] * OUTPUT_DIM)
    lines = read_csv(filename)
    # leave val% for validation
    train_seq = []
    valid_seq = []
    cnt = 0
    for ln in lines:
        if cnt < SEQ_LEN * BATCH_SIZE * (100 - val): 
            train_seq.append(ln)
            sum_f += ln[1]
            sum_sq_f += ln[1] * ln[1]
        else:
            valid_seq.append(ln)
        cnt += 1
        cnt %= SEQ_LEN * BATCH_SIZE * 100
    mean = sum_f / len(train_seq)
    var = sum_sq_f / len(train_seq) - mean * mean
    std = np.sqrt(var)
    print(len(train_seq), len(valid_seq))
    print (mean, std) # we will need these statistics to normalize the outputs (and ground truth inputs)
    return (train_seq, valid_seq), (mean, std)
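The mean/std statistics returned by `process_csv` are used to z-score the targets; predictions must be mapped back with the inverse transform (as `steering_predictions` does later). A toy roundtrip with hypothetical numbers close to the printed statistics:

```python
import numpy as np

# Hypothetical statistics, shaped like the (mean, std) from process_csv
mean, std = np.array([-0.07]), np.array([4.61])

wheel = np.array([3.5])                 # raw steering angle
normalized = (wheel - mean) / std       # what the network is trained on
restored = normalized * std + mean      # inverse applied to predictions
```

Note this also means reported MSE values in normalized space must be scaled by `std**2` before comparing them with losses measured in degrees.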
In [4]:
# Read all csv files used for training and testing
(train_seq, valid_seq), (mean, std) = process_csv(filename="./train_images/train_images.csv", val=5) # concatenated interpolated.csv from rosbags
#test_seq = read_csv("./test_images/test_images.csv") # interpolated.csv for testset filled with dummy values 
(test_seq, _), (test_mean, test_std) = process_csv(filename="./test_images/test_images.csv", val=0)

# Also load the test csv into a DataFrame
test_wheel = pd.read_csv('./test_images/test_images.csv')
23100 1200
[-0.06969697] [ 4.60519558]
2700 0
[-1.82666667] [ 2.12810818]
In [5]:
### <4> Define the "vision module" and the recurrent stateful cell
### See the "solution-komanda.ipynb" file for a detailed walkthrough of this code

layer_norm = lambda x: tf.contrib.layers.layer_norm(inputs=x, center=True, scale=True, activation_fn=None, trainable=True)

def get_optimizer(loss, lrate):
    optimizer = tf.train.AdamOptimizer(learning_rate=lrate)
    gradvars = optimizer.compute_gradients(loss)
    gradients, v = zip(*gradvars)
    print([x.name for x in v])
    gradients, _ = tf.clip_by_global_norm(gradients, 15.0)
    return optimizer.apply_gradients(zip(gradients, v))

def apply_vision_simple(image, keep_prob, batch_size, seq_len, scope=None, reuse=None):
    video = tf.reshape(image, shape=[batch_size, LEFT_CONTEXT + seq_len, HEIGHT, WIDTH, CHANNELS])
    with tf.variable_scope(scope, 'Vision', [image], reuse=reuse):
        net = slim.convolution(video, num_outputs=64, kernel_size=[3,5,5], stride=[1,2,2], padding="VALID")
        net = tf.nn.dropout(x=net, keep_prob=keep_prob)
        aux1 = slim.fully_connected(tf.reshape(net[:, -seq_len:, :, :, :], [batch_size, seq_len, -1]), 128, activation_fn=None)
        
        net = slim.convolution(net, num_outputs=64, kernel_size=[2,5,5], stride=[1,2,2], padding="VALID")
        net = tf.nn.dropout(x=net, keep_prob=keep_prob)
        aux2 = slim.fully_connected(tf.reshape(net[:, -seq_len:, :, :, :], [batch_size, seq_len, -1]), 128, activation_fn=None)
        
        net = slim.convolution(net, num_outputs=64, kernel_size=[2,5,5], stride=[1,1,1], padding="VALID")
        net = tf.nn.dropout(x=net, keep_prob=keep_prob)
        aux3 = slim.fully_connected(tf.reshape(net[:, -seq_len:, :, :, :], [batch_size, seq_len, -1]), 128, activation_fn=None)
        
        net = slim.convolution(net, num_outputs=64, kernel_size=[2,5,5], stride=[1,1,1], padding="VALID")
        net = tf.nn.dropout(x=net, keep_prob=keep_prob)
        # at this point the tensor 'net' is of shape batch_size x seq_len x ...
        aux4 = slim.fully_connected(tf.reshape(net, [batch_size, seq_len, -1]), 128, activation_fn=None)
        
        net = slim.fully_connected(tf.reshape(net, [batch_size, seq_len, -1]), 1024, activation_fn=tf.nn.relu)
        net = tf.nn.dropout(x=net, keep_prob=keep_prob)
        net = slim.fully_connected(net, 512, activation_fn=tf.nn.relu)
        net = tf.nn.dropout(x=net, keep_prob=keep_prob)
        net = slim.fully_connected(net, 256, activation_fn=tf.nn.relu)
        net = tf.nn.dropout(x=net, keep_prob=keep_prob)
        net = slim.fully_connected(net, 128, activation_fn=None)
        return layer_norm(tf.nn.elu(net + aux1 + aux2 + aux3 + aux4)) # aux[1-4] are residual connections (shortcuts)

class SamplingRNNCell(tf.nn.rnn_cell.RNNCell):
  """Simple sampling RNN cell."""

  def __init__(self, num_outputs, use_ground_truth, internal_cell):
    """
    if use_ground_truth then don't sample
    """
    self._num_outputs = num_outputs
    self._use_ground_truth = use_ground_truth # boolean
    self._internal_cell = internal_cell # may be LSTM or GRU or anything
  
  @property
  def state_size(self):
    return self._num_outputs, self._internal_cell.state_size # previous output and bottleneck state

  @property
  def output_size(self):
    return self._num_outputs # steering angle, torque, vehicle speed

  def __call__(self, inputs, state, scope=None):
    (visual_feats, current_ground_truth) = inputs
    prev_output, prev_state_internal = state
    context = tf.concat(1, [prev_output, visual_feats])
    new_output_internal, new_state_internal = self._internal_cell(context, prev_state_internal) # here the internal cell (e.g. LSTM) is called
    new_output = tf.contrib.layers.fully_connected(
        inputs=tf.concat(1, [new_output_internal, prev_output, visual_feats]),
        num_outputs=self._num_outputs,
        activation_fn=None,
        scope="OutputProjection")
    # if self._use_ground_truth == True, we pass the ground truth as the state; otherwise, we use the model's predictions
    return new_output, (current_ground_truth if self._use_ground_truth else new_output, new_state_internal)
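The difference between the two `SamplingRNNCell` modes can be sketched with a scalar toy cell (a hypothetical update rule, not the LSTM used here): with `use_ground_truth` the next step sees the true previous value (teacher forcing); otherwise it feeds back its own previous output (autoregressive), as at inference time.

```python
def rollout(truth, use_ground_truth, step=lambda prev, x: 0.5 * prev + x):
    """Unroll a toy recurrent step; the per-step input x is fixed to 1.0."""
    prev = 0.0
    outputs = []
    for t in range(len(truth)):
        out = step(prev, 1.0)
        outputs.append(out)
        # Teacher forcing feeds the ground truth; autoregressive feeds itself
        prev = truth[t] if use_ground_truth else out
    return outputs

truth = [2.0, 2.0, 2.0]
teacher_forced = rollout(truth, use_ground_truth=True)
autoregressive = rollout(truth, use_ground_truth=False)
```

The two rollouts diverge after the first step, which is why the model trains on both variants (`mse_gt` and `mse_autoregressive`) in the loss below.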
In [6]:
### <5> Build the model
### See the "solution-komanda.ipynb" file for details

graph = tf.Graph()

with graph.as_default():
    # inputs  
    learning_rate = tf.placeholder_with_default(input=1e-4, shape=())
    keep_prob = tf.placeholder_with_default(input=1.0, shape=())
    aux_cost_weight = tf.placeholder_with_default(input=0.1, shape=())
    
    inputs = tf.placeholder(shape=(BATCH_SIZE,LEFT_CONTEXT+SEQ_LEN), dtype=tf.string) # paths to image files from the central camera
    targets = tf.placeholder(shape=(BATCH_SIZE,SEQ_LEN,OUTPUT_DIM), dtype=tf.float32) # batch_size x seq_len x OUTPUT_DIM
    targets_normalized = (targets - mean) / std
    
    input_images = tf.pack([tf.image.decode_jpeg(tf.read_file(x))
                            for x in tf.unpack(tf.reshape(inputs, shape=[(LEFT_CONTEXT+SEQ_LEN) * BATCH_SIZE]))])
    input_images = -1.0 + 2.0 * tf.cast(input_images, tf.float32) / 255.0
    input_images.set_shape([(LEFT_CONTEXT+SEQ_LEN) * BATCH_SIZE, HEIGHT, WIDTH, CHANNELS])
    visual_conditions_reshaped = apply_vision_simple(image=input_images, keep_prob=keep_prob, 
                                                     batch_size=BATCH_SIZE, seq_len=SEQ_LEN)
    visual_conditions = tf.reshape(visual_conditions_reshaped, [BATCH_SIZE, SEQ_LEN, -1])
    visual_conditions = tf.nn.dropout(x=visual_conditions, keep_prob=keep_prob)
    
    rnn_inputs_with_ground_truth = (visual_conditions, targets_normalized)
    rnn_inputs_autoregressive = (visual_conditions, tf.zeros(shape=(BATCH_SIZE, SEQ_LEN, OUTPUT_DIM), dtype=tf.float32))
    
    internal_cell = tf.nn.rnn_cell.LSTMCell(num_units=RNN_SIZE, num_proj=RNN_PROJ)
    cell_with_ground_truth = SamplingRNNCell(num_outputs=OUTPUT_DIM, use_ground_truth=True, internal_cell=internal_cell)
    cell_autoregressive = SamplingRNNCell(num_outputs=OUTPUT_DIM, use_ground_truth=False, internal_cell=internal_cell)
    
    def get_initial_state(complex_state_tuple_sizes):
        flat_sizes = tf.nn.rnn_cell.nest.flatten(complex_state_tuple_sizes)
        init_state_flat = [tf.tile(
            multiples=[BATCH_SIZE, 1], 
            input=tf.get_variable("controller_initial_state_%d" % i, initializer=tf.zeros_initializer, shape=([1, s]), dtype=tf.float32))
         for i,s in enumerate(flat_sizes)]
        init_state = tf.nn.rnn_cell.nest.pack_sequence_as(complex_state_tuple_sizes, init_state_flat)
        return init_state
    def deep_copy_initial_state(complex_state_tuple):
        flat_state = tf.nn.rnn_cell.nest.flatten(complex_state_tuple)
        flat_copy = [tf.identity(s) for s in flat_state]
        deep_copy = tf.nn.rnn_cell.nest.pack_sequence_as(complex_state_tuple, flat_copy)
        return deep_copy
    
    controller_initial_state_variables = get_initial_state(cell_autoregressive.state_size)
    controller_initial_state_autoregressive = deep_copy_initial_state(controller_initial_state_variables)
    controller_initial_state_gt = deep_copy_initial_state(controller_initial_state_variables)

    with tf.variable_scope("predictor"):
        out_gt, controller_final_state_gt = tf.nn.dynamic_rnn(cell=cell_with_ground_truth, inputs=rnn_inputs_with_ground_truth, 
                          sequence_length=[SEQ_LEN]*BATCH_SIZE, initial_state=controller_initial_state_gt, dtype=tf.float32,
                          swap_memory=True, time_major=False)
    with tf.variable_scope("predictor", reuse=True):
        out_autoregressive, controller_final_state_autoregressive = tf.nn.dynamic_rnn(cell=cell_autoregressive, inputs=rnn_inputs_autoregressive, 
                          sequence_length=[SEQ_LEN]*BATCH_SIZE, initial_state=controller_initial_state_autoregressive, dtype=tf.float32,
                          swap_memory=True, time_major=False)
    
    mse_gt = tf.reduce_mean(tf.squared_difference(out_gt, targets_normalized))
    mse_autoregressive = tf.reduce_mean(tf.squared_difference(out_autoregressive, targets_normalized))
    mse_autoregressive_steering = tf.reduce_mean(tf.squared_difference(out_autoregressive[:, :, 0], targets_normalized[:, :, 0]))
    steering_predictions = (out_autoregressive[:, :, 0] * std[0]) + mean[0]
    
    total_loss = mse_autoregressive_steering + aux_cost_weight * (mse_gt + mse_autoregressive)
    
    optimizer = get_optimizer(total_loss, learning_rate)

    tf.summary.scalar("MAIN_TRAIN_METRIC__mse_autoregressive_steering", mse_autoregressive_steering)
    tf.summary.scalar("mse_gt", mse_gt)
    tf.summary.scalar("mse_autoregressive", mse_autoregressive)
    
    summaries = tf.merge_all_summaries()
    train_writer = tf.summary.FileWriter('v3/train_summary', graph=graph)
    valid_writer = tf.summary.FileWriter('v3/valid_summary', graph=graph)
    saver = tf.train.Saver(write_version=tf.train.SaverDef.V2)
['Vision/Conv/weights:0', 'Vision/Conv/biases:0', 'Vision/fully_connected/weights:0', 'Vision/fully_connected/biases:0', 'Vision/Conv_1/weights:0', 'Vision/Conv_1/biases:0', 'Vision/fully_connected_1/weights:0', 'Vision/fully_connected_1/biases:0', 'Vision/Conv_2/weights:0', 'Vision/Conv_2/biases:0', 'Vision/fully_connected_2/weights:0', 'Vision/fully_connected_2/biases:0', 'Vision/Conv_3/weights:0', 'Vision/Conv_3/biases:0', 'Vision/fully_connected_3/weights:0', 'Vision/fully_connected_3/biases:0', 'Vision/fully_connected_4/weights:0', 'Vision/fully_connected_4/biases:0', 'Vision/fully_connected_5/weights:0', 'Vision/fully_connected_5/biases:0', 'Vision/fully_connected_6/weights:0', 'Vision/fully_connected_6/biases:0', 'Vision/fully_connected_7/weights:0', 'Vision/fully_connected_7/biases:0', 'Vision/LayerNorm/beta:0', 'Vision/LayerNorm/gamma:0', 'controller_initial_state_0:0', 'controller_initial_state_1:0', 'controller_initial_state_2:0', 'predictor/RNN/LSTMCell/W_0:0', 'predictor/RNN/LSTMCell/B:0', 'predictor/RNN/LSTMCell/W_P_0:0', 'predictor/RNN/OutputProjection/weights:0', 'predictor/RNN/OutputProjection/biases:0']
WARNING:tensorflow:From <ipython-input-6-4d1293e86359>:72 in <module>.: merge_all_summaries (from tensorflow.python.ops.logging_ops) is deprecated and will be removed after 2016-11-30.
Instructions for updating:
Please switch to tf.summary.merge_all.
WARNING:tensorflow:From D:\Anaconda3\envs\carnd-term1\lib\site-packages\tensorflow\python\ops\logging_ops.py:264 in merge_all_summaries.: merge_summary (from tensorflow.python.ops.logging_ops) is deprecated and will be removed after 2016-11-30.
Instructions for updating:
Please switch to tf.summary.merge.
In [7]:
### <6> Train the model
### See the "solution-komanda.ipynb" file for details

gpu_options = tf.GPUOptions(per_process_gpu_memory_fraction=1.0)

checkpoint_dir = os.getcwd() + "/v3"

global_train_step = 0
global_valid_step = 0


KEEP_PROB_TRAIN = 0.25

def do_epoch(session, sequences, mode):
    global global_train_step, global_valid_step
    test_predictions = {}
    valid_predictions = {}
    batch_generator = BatchGenerator(sequence=sequences, seq_len=SEQ_LEN, batch_size=BATCH_SIZE)
    total_num_steps = int(1 + (batch_generator.indices[1] - 1) // SEQ_LEN)
    controller_final_state_gt_cur, controller_final_state_autoregressive_cur = None, None
    acc_loss = np.float64(0.0)
    for step in range(total_num_steps):
        feed_inputs, feed_targets = batch_generator.next()
        feed_dict = {inputs : feed_inputs, targets : feed_targets}
        if controller_final_state_autoregressive_cur is not None:
            feed_dict.update({controller_initial_state_autoregressive : controller_final_state_autoregressive_cur})
        if controller_final_state_gt_cur is not None:
            feed_dict.update({controller_final_state_gt : controller_final_state_gt_cur})
        if mode == "train":
            feed_dict.update({keep_prob : KEEP_PROB_TRAIN})
            summary, _, loss, controller_final_state_gt_cur, controller_final_state_autoregressive_cur = \
                session.run([summaries, optimizer, mse_autoregressive_steering, controller_final_state_gt, controller_final_state_autoregressive],
                           feed_dict = feed_dict)
            train_writer.add_summary(summary, global_train_step)
            global_train_step += 1
        elif mode == "valid":
            model_predictions, summary, loss, controller_final_state_autoregressive_cur = \
                session.run([steering_predictions, summaries, mse_autoregressive_steering, controller_final_state_autoregressive],
                           feed_dict = feed_dict)
            valid_writer.add_summary(summary, global_valid_step)
            global_valid_step += 1  
            feed_inputs = feed_inputs[:, LEFT_CONTEXT:].flatten()
            steering_targets = feed_targets[:, :, 0].flatten()
            model_predictions = model_predictions.flatten()
            stats = np.stack([steering_targets, model_predictions, (steering_targets - model_predictions)**2])
            for i, img in enumerate(feed_inputs):
                valid_predictions[img] = stats[:, i]
        elif mode == "test":
            model_predictions, loss, controller_final_state_autoregressive_cur = \
                session.run([steering_predictions,mse_autoregressive_steering, controller_final_state_autoregressive],
                           feed_dict = feed_dict)           
            feed_inputs = feed_inputs[:, LEFT_CONTEXT:].flatten()
            steering_targets = feed_targets[:, :, 0].flatten()
            model_predictions = model_predictions.flatten()
            stats = np.stack([steering_targets, model_predictions, (steering_targets - model_predictions)**2])
            for i, img in enumerate(feed_inputs):
                test_predictions[img] = stats[:, i]
        if mode != "test":
            acc_loss += loss
            print('\r', step + 1, "/", total_num_steps,'The '+ mode + ' loss', acc_loss / (step+1))
    print()
    return (np.sqrt(acc_loss / total_num_steps), valid_predictions) if mode != "test" else (None, test_predictions)
    

NUM_EPOCHS=20

best_validation_score = None
with tf.Session(graph=graph) as session:
    session.run(tf.initialize_all_variables())
    print('Initialized')
    ckpt = tf.train.latest_checkpoint(checkpoint_dir)
    if ckpt:
        print("Restoring from", ckpt)
        saver.restore(sess=session, save_path=ckpt)
    for epoch in range(NUM_EPOCHS):
        print("Starting epoch %d" % epoch)
        print("Validation:")
        valid_score, valid_predictions = do_epoch(session=session, sequences=valid_seq, mode="valid")
        if best_validation_score is None: 
            best_validation_score = valid_score
        if valid_score < best_validation_score:
            saver.save(session, 'v3/checkpoint-sdc-ch2')
            best_validation_score = valid_score
            with open("v3/valid-predictions-epoch%d" % epoch, "w") as out:
                result = np.float64(0.0)
                for img, stats in valid_predictions.items():
                    result += stats[-1]
            print("Validation MSE(val_loss):", result / len(valid_predictions))
            with open("v3/test-predictions-epoch%d" % epoch, "w") as out:
                _, test_predictions = do_epoch(session=session, sequences=test_seq, mode="test")
                result = np.float64(0.0)
                for img, stats in test_predictions.items():
                    result += stats[-1]
                
                print("Test MSE(test_loss):", result / len(test_predictions))  
                    
                    
        if epoch != NUM_EPOCHS - 1:
            print("Training")
            do_epoch(session=session, sequences=train_seq, mode="train")
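上面训练主循环中"仅当验证分数严格优于历史最佳时才保存 checkpoint"的逻辑,可抽象为如下纯 Python 示意(`keep_best` 为示例函数名,非原代码)。注意第一个 epoch 仅用于设定基准分数,因此 epoch 0 不会触发保存,这与下方日志中 epoch 0 未打印 Validation MSE 的现象一致:

```python
def keep_best(valid_scores):
    """模拟上述循环:返回会触发 checkpoint 保存的 epoch 序号列表。"""
    best = None
    saved_at = []
    for epoch, score in enumerate(valid_scores):
        if best is None:
            best = score          # 第一个 epoch 只设定基准,不保存
        if score < best:          # 严格优于历史最佳才保存
            saved_at.append(epoch)
            best = score
    return saved_at

# 示例:验证分数逐 epoch 变化
# keep_best([1.69, 0.80, 0.90, 0.70]) -> [1, 3]
```

若希望 epoch 0 也能保存一次基准模型,可将判断改为 `best is None or score < best` 并合并两个分支;此处保持与原实现一致。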
WARNING:tensorflow:From <ipython-input-7-85165e56734f>:66 in <module>.: initialize_all_variables (from tensorflow.python.ops.variables) is deprecated and will be removed after 2017-03-02.
Instructions for updating:
Use `tf.global_variables_initializer` instead.
Initialized
Starting epoch 0
Validation:
 1 / 30 The valid loss 1.22847342491
 2 / 30 The valid loss 1.30617243052
 ...
 29 / 30 The valid loss 1.72623381327
 30 / 30 The valid loss 1.69248875777

Training
 1 / 578 The train loss 9.84366512299
 2 / 578 The train loss 8.36939239502
 ...
 577 / 578 The train loss 1.22837022591
 578 / 578 The train loss 1.22749360136

Starting epoch 1
Validation:
 1 / 30 The valid loss 0.461220532656
 2 / 30 The valid loss 0.435441240668
 ...
 29 / 30 The valid loss 0.817362181072
 30 / 30 The valid loss 0.79903665185

Validation MSE(val_loss): 16.9458305262

Test MSE(test_loss): 5.62044679525
Training
 1 / 578 The train loss 0.656893014908
 2 / 578 The train loss 0.51439127326
 ...
 170 / 578 The train loss 0.486863739964
 171 / 578 The train loss 0.49397994543
 172 / 578 The train loss 0.499667085794
 173 / 578 The train loss 0.506144971431
 174 / 578 The train loss 0.51438918804
 175 / 578 The train loss 0.520788541436
 176 / 578 The train loss 0.527301016094
 177 / 578 The train loss 0.531811522815
 178 / 578 The train loss 0.535075665106
 179 / 578 The train loss 0.538823095031
 180 / 578 The train loss 0.542200564593
 181 / 578 The train loss 0.544558169351
 182 / 578 The train loss 0.545623917419
 183 / 578 The train loss 0.546993440536
 184 / 578 The train loss 0.551764495914
 185 / 578 The train loss 0.557521292889
 186 / 578 The train loss 0.570879860511
 187 / 578 The train loss 0.588565629992
 188 / 578 The train loss 0.60379695504
 189 / 578 The train loss 0.616499372181
 190 / 578 The train loss 0.632003413925
 191 / 578 The train loss 0.644923282465
 192 / 578 The train loss 0.656842821511
 193 / 578 The train loss 0.667112529509
 194 / 578 The train loss 0.67460159535
 195 / 578 The train loss 0.678933520118
 196 / 578 The train loss 0.683704879135
 197 / 578 The train loss 0.691371891629
 198 / 578 The train loss 0.695130027229
 199 / 578 The train loss 0.699180781467
 200 / 578 The train loss 0.704269750044
 201 / 578 The train loss 0.713848593149
 202 / 578 The train loss 0.728205827129
 203 / 578 The train loss 0.740272743875
 204 / 578 The train loss 0.749833096403
 205 / 578 The train loss 0.756452010245
 206 / 578 The train loss 0.75975119771
 207 / 578 The train loss 0.762821278062
 208 / 578 The train loss 0.771033676699
 209 / 578 The train loss 0.782010526677
 210 / 578 The train loss 0.78999755489
 211 / 578 The train loss 0.796734639183
 212 / 578 The train loss 0.803386886095
 213 / 578 The train loss 0.809608458283
 214 / 578 The train loss 0.814040996691
 215 / 578 The train loss 0.817935486658
 216 / 578 The train loss 0.821032472714
 217 / 578 The train loss 0.822292116312
 218 / 578 The train loss 0.824016759087
 219 / 578 The train loss 0.826812064743
 220 / 578 The train loss 0.829710317674
 221 / 578 The train loss 0.833871347804
 222 / 578 The train loss 0.837919125082
 223 / 578 The train loss 0.842254639674
 224 / 578 The train loss 0.846780111681
 225 / 578 The train loss 0.854029669166
 226 / 578 The train loss 0.861799362421
 227 / 578 The train loss 0.869124710757
 228 / 578 The train loss 0.874426386793
 229 / 578 The train loss 0.880692478806
 230 / 578 The train loss 0.886911554894
 231 / 578 The train loss 0.893217324114
 232 / 578 The train loss 0.898513766134
 233 / 578 The train loss 0.904062571891
 234 / 578 The train loss 0.910720734451
 235 / 578 The train loss 0.91922141434
 236 / 578 The train loss 0.924496851672
 237 / 578 The train loss 0.925650761537
 238 / 578 The train loss 0.925076751586
 239 / 578 The train loss 0.924946717076
 240 / 578 The train loss 0.925549200612
 241 / 578 The train loss 0.927205178191
 242 / 578 The train loss 0.929632509481
 243 / 578 The train loss 0.932009726334
 244 / 578 The train loss 0.93537494583
 245 / 578 The train loss 0.940716593545
 246 / 578 The train loss 0.945506816896
 247 / 578 The train loss 0.947076435453
 248 / 578 The train loss 0.948605849317
 249 / 578 The train loss 0.950703143535
 250 / 578 The train loss 0.954853717744
 251 / 578 The train loss 0.959139605383
 252 / 578 The train loss 0.963885344151
 253 / 578 The train loss 0.970825473252
 254 / 578 The train loss 0.980385698613
 255 / 578 The train loss 0.990481887086
 256 / 578 The train loss 1.00271268183
 257 / 578 The train loss 1.01382757341
 258 / 578 The train loss 1.02330581064
 259 / 578 The train loss 1.0288149417
 260 / 578 The train loss 1.03408712877
 261 / 578 The train loss 1.03940809178
 262 / 578 The train loss 1.04420892058
 263 / 578 The train loss 1.04900441327
 264 / 578 The train loss 1.05547956168
 265 / 578 The train loss 1.06088992239
 266 / 578 The train loss 1.06731364865
 267 / 578 The train loss 1.07405857999
 268 / 578 The train loss 1.08228694764
 269 / 578 The train loss 1.09176159077
 270 / 578 The train loss 1.10355501346
 271 / 578 The train loss 1.11349708474
 272 / 578 The train loss 1.12236233576
 273 / 578 The train loss 1.13164720862
 274 / 578 The train loss 1.14075772861
 275 / 578 The train loss 1.14955613911
 276 / 578 The train loss 1.15788460636
 277 / 578 The train loss 1.16475572563
 278 / 578 The train loss 1.1724445159
 279 / 578 The train loss 1.18026555904
 280 / 578 The train loss 1.18956363622
 281 / 578 The train loss 1.19952603331
 282 / 578 The train loss 1.20770939524
 283 / 578 The train loss 1.21626285103
 284 / 578 The train loss 1.22478228382
 285 / 578 The train loss 1.23277425426
 286 / 578 The train loss 1.2400544081
 287 / 578 The train loss 1.24737174414
 288 / 578 The train loss 1.25457898771
 289 / 578 The train loss 1.26163235993
 290 / 578 The train loss 1.26867332125
 291 / 578 The train loss 1.27635123225
 292 / 578 The train loss 1.28502020684
 293 / 578 The train loss 1.2928420911
 294 / 578 The train loss 1.29987602407
 295 / 578 The train loss 1.3072320679
 296 / 578 The train loss 1.313424882
 297 / 578 The train loss 1.31888879665
 298 / 578 The train loss 1.32399127142
 299 / 578 The train loss 1.32966924745
 300 / 578 The train loss 1.33653313314
 301 / 578 The train loss 1.34251298614
 302 / 578 The train loss 1.34883016067
 303 / 578 The train loss 1.35458592215
 304 / 578 The train loss 1.35986470701
 305 / 578 The train loss 1.36442580765
 306 / 578 The train loss 1.36847371053
 307 / 578 The train loss 1.37293585577
 308 / 578 The train loss 1.37721237729
 309 / 578 The train loss 1.38024585142
 310 / 578 The train loss 1.38091623701
 311 / 578 The train loss 1.37995887411
 312 / 578 The train loss 1.37798455658
 313 / 578 The train loss 1.375754948
 314 / 578 The train loss 1.37392989612
 315 / 578 The train loss 1.37254826072
 316 / 578 The train loss 1.37158854584
 317 / 578 The train loss 1.37096722866
 318 / 578 The train loss 1.37009012994
 319 / 578 The train loss 1.36895679152
 320 / 578 The train loss 1.36688739671
 321 / 578 The train loss 1.36376420309
 322 / 578 The train loss 1.36048505779
 323 / 578 The train loss 1.35702666031
 324 / 578 The train loss 1.35349482583
 325 / 578 The train loss 1.3499563816
 326 / 578 The train loss 1.34622567046
 327 / 578 The train loss 1.34252748007
 328 / 578 The train loss 1.33879218344
 329 / 578 The train loss 1.33537965137
 330 / 578 The train loss 1.33186484108
 331 / 578 The train loss 1.3283430683
 332 / 578 The train loss 1.32485364891
 333 / 578 The train loss 1.32164049357
 334 / 578 The train loss 1.31927538768
 335 / 578 The train loss 1.31800700117
 336 / 578 The train loss 1.31752034483
 337 / 578 The train loss 1.31785035551
 338 / 578 The train loss 1.31894520465
 339 / 578 The train loss 1.31963977484
 340 / 578 The train loss 1.31863917842
 341 / 578 The train loss 1.31628848234
 342 / 578 The train loss 1.313835583
 343 / 578 The train loss 1.31140301068
 344 / 578 The train loss 1.3091624025
 345 / 578 The train loss 1.30718124943
 346 / 578 The train loss 1.30512872556
 347 / 578 The train loss 1.30358159038
 348 / 578 The train loss 1.30228859351
 349 / 578 The train loss 1.30151631286
 350 / 578 The train loss 1.30095872498
 351 / 578 The train loss 1.30016417884
 352 / 578 The train loss 1.29950726839
 353 / 578 The train loss 1.29861430839
 354 / 578 The train loss 1.29801577862
 355 / 578 The train loss 1.29733872172
 356 / 578 The train loss 1.29696224138
 357 / 578 The train loss 1.2967451523
 358 / 578 The train loss 1.29639168419
 359 / 578 The train loss 1.29541828218
 360 / 578 The train loss 1.29454967854
 361 / 578 The train loss 1.29372138274
 362 / 578 The train loss 1.29250973161
 363 / 578 The train loss 1.29102756918
 364 / 578 The train loss 1.28935748857
 365 / 578 The train loss 1.28774967579
 366 / 578 The train loss 1.28458150881
 367 / 578 The train loss 1.28129503267
 368 / 578 The train loss 1.27805851031
 369 / 578 The train loss 1.27476920292
 370 / 578 The train loss 1.2715237578
 371 / 578 The train loss 1.26840277346
 372 / 578 The train loss 1.26530045822
 373 / 578 The train loss 1.2622729119
 374 / 578 The train loss 1.25920696186
 375 / 578 The train loss 1.25625469522
 376 / 578 The train loss 1.25329092704
 377 / 578 The train loss 1.25033488423
 378 / 578 The train loss 1.24736317302
 379 / 578 The train loss 1.24446725004
 380 / 578 The train loss 1.24167561562
 381 / 578 The train loss 1.23915055652
 382 / 578 The train loss 1.23687944729
 383 / 578 The train loss 1.23455453254
 384 / 578 The train loss 1.23215456881
 385 / 578 The train loss 1.22966956458
 386 / 578 The train loss 1.22694032678
 387 / 578 The train loss 1.22419191235
 388 / 578 The train loss 1.22147807694
 389 / 578 The train loss 1.21892908376
 390 / 578 The train loss 1.21615725985
 391 / 578 The train loss 1.2133354242
 392 / 578 The train loss 1.21055522869
 393 / 578 The train loss 1.20780301598
 394 / 578 The train loss 1.2049866223
 395 / 578 The train loss 1.20217246364
 396 / 578 The train loss 1.19985928902
 397 / 578 The train loss 1.19785334823
 398 / 578 The train loss 1.19624417912
 399 / 578 The train loss 1.19449371275
 400 / 578 The train loss 1.19274387863
 401 / 578 The train loss 1.19093902223
 402 / 578 The train loss 1.18924003762
 403 / 578 The train loss 1.18786574412
 404 / 578 The train loss 1.18647546782
 405 / 578 The train loss 1.1853338874
 406 / 578 The train loss 1.18411819567
 407 / 578 The train loss 1.18304497674
 408 / 578 The train loss 1.1819630224
 409 / 578 The train loss 1.18349276017
 410 / 578 The train loss 1.18691443986
 411 / 578 The train loss 1.18906784982
 412 / 578 The train loss 1.1900777666
 413 / 578 The train loss 1.19081665323
 414 / 578 The train loss 1.19210467761
 415 / 578 The train loss 1.19294096127
 416 / 578 The train loss 1.19271716071
 417 / 578 The train loss 1.19205286216
 418 / 578 The train loss 1.19168316102
 419 / 578 The train loss 1.19154257119
 420 / 578 The train loss 1.19157863667
 421 / 578 The train loss 1.19173449048
 422 / 578 The train loss 1.19164807575
 423 / 578 The train loss 1.19183845001
 424 / 578 The train loss 1.19170311863
 425 / 578 The train loss 1.19129520343
 426 / 578 The train loss 1.19034975652
 427 / 578 The train loss 1.18910317646
 428 / 578 The train loss 1.18800997814
 429 / 578 The train loss 1.18641946724
 430 / 578 The train loss 1.18453396909
 431 / 578 The train loss 1.18242623832
 432 / 578 The train loss 1.18025222679
 433 / 578 The train loss 1.17800650713
 434 / 578 The train loss 1.17558433734
 435 / 578 The train loss 1.1731749239
 436 / 578 The train loss 1.17092702042
 437 / 578 The train loss 1.16888041167
 438 / 578 The train loss 1.16742349382
 439 / 578 The train loss 1.16630469276
 440 / 578 The train loss 1.16518949436
 441 / 578 The train loss 1.16389926656
 442 / 578 The train loss 1.16246276814
 443 / 578 The train loss 1.16129499368
 444 / 578 The train loss 1.16085128664
 445 / 578 The train loss 1.16100060782
 446 / 578 The train loss 1.16161284161
 447 / 578 The train loss 1.16226709719
 448 / 578 The train loss 1.16279007099
 449 / 578 The train loss 1.16279425869
 450 / 578 The train loss 1.1625788832
 451 / 578 The train loss 1.16261007229
 452 / 578 The train loss 1.16243694133
 453 / 578 The train loss 1.16200727807
 454 / 578 The train loss 1.16136773282
 455 / 578 The train loss 1.16105873247
 456 / 578 The train loss 1.16058595194
 457 / 578 The train loss 1.1598514924
 458 / 578 The train loss 1.15901497478
 459 / 578 The train loss 1.15798786526
 460 / 578 The train loss 1.15752958788
 461 / 578 The train loss 1.15667505588
 462 / 578 The train loss 1.15581868934
 463 / 578 The train loss 1.15455914628
 464 / 578 The train loss 1.15266270065
 465 / 578 The train loss 1.15068500971
 466 / 578 The train loss 1.14846447809
 467 / 578 The train loss 1.14635546501
 468 / 578 The train loss 1.14440328184
 469 / 578 The train loss 1.14265673217
 470 / 578 The train loss 1.1407789351
 471 / 578 The train loss 1.13901095192
 472 / 578 The train loss 1.13704468046
 473 / 578 The train loss 1.13521015362
 474 / 578 The train loss 1.13331929351
 475 / 578 The train loss 1.13132935334
 476 / 578 The train loss 1.12954833689
 477 / 578 The train loss 1.12765552111
 478 / 578 The train loss 1.1256697674
 479 / 578 The train loss 1.12349100926
 480 / 578 The train loss 1.12135039736
 481 / 578 The train loss 1.11926238883
 482 / 578 The train loss 1.11715824455
 483 / 578 The train loss 1.11508427594
 484 / 578 The train loss 1.11307002142
 485 / 578 The train loss 1.11113490536
 486 / 578 The train loss 1.10917745138
 487 / 578 The train loss 1.10719293561
 488 / 578 The train loss 1.1053315122
 489 / 578 The train loss 1.10336438211
 490 / 578 The train loss 1.10136573765
 491 / 578 The train loss 1.0995356229
 492 / 578 The train loss 1.09761565642
 493 / 578 The train loss 1.0957015157
 494 / 578 The train loss 1.09373321555
 495 / 578 The train loss 1.09201102604
 496 / 578 The train loss 1.09045896948
 497 / 578 The train loss 1.08905314066
 498 / 578 The train loss 1.08758818647
 499 / 578 The train loss 1.08603458221
 500 / 578 The train loss 1.08427067368
 501 / 578 The train loss 1.08246826053
 502 / 578 The train loss 1.08082322917
 503 / 578 The train loss 1.07920881826
 504 / 578 The train loss 1.07763928014
 505 / 578 The train loss 1.07596923849
 506 / 578 The train loss 1.07434106502
 507 / 578 The train loss 1.07290328406
 508 / 578 The train loss 1.07117021534
 509 / 578 The train loss 1.06936819414
 510 / 578 The train loss 1.06752316737
 511 / 578 The train loss 1.06565786828
 512 / 578 The train loss 1.06386256317
 513 / 578 The train loss 1.06196459117
 514 / 578 The train loss 1.06016278157
 515 / 578 The train loss 1.05840609704
 516 / 578 The train loss 1.05667989337
 517 / 578 The train loss 1.05508447875
 518 / 578 The train loss 1.0534417135
 519 / 578 The train loss 1.05169803289
 520 / 578 The train loss 1.05009503021
 521 / 578 The train loss 1.04845964645
 522 / 578 The train loss 1.0466845541
 523 / 578 The train loss 1.04499381252
 524 / 578 The train loss 1.04343014474
 525 / 578 The train loss 1.04201297472
 526 / 578 The train loss 1.04059812043
 527 / 578 The train loss 1.03926926361
 528 / 578 The train loss 1.03793602912
 529 / 578 The train loss 1.03675731257
 530 / 578 The train loss 1.03541422685
 531 / 578 The train loss 1.03405722708
 532 / 578 The train loss 1.03280507388
 533 / 578 The train loss 1.03147959532
 534 / 578 The train loss 1.02999813206
 535 / 578 The train loss 1.02842283364
 536 / 578 The train loss 1.0269291249
 537 / 578 The train loss 1.02540784693
 538 / 578 The train loss 1.02409214381
 539 / 578 The train loss 1.02265296836
 540 / 578 The train loss 1.0212907969
 541 / 578 The train loss 1.02000726488
 542 / 578 The train loss 1.01873620879
 543 / 578 The train loss 1.01740749766
 544 / 578 The train loss 1.01618254137
 545 / 578 The train loss 1.01522224782
 546 / 578 The train loss 1.01544123955
 547 / 578 The train loss 1.01632642311
 548 / 578 The train loss 1.01863299018
 549 / 578 The train loss 1.01983500267
 550 / 578 The train loss 1.02016944735
 551 / 578 The train loss 1.02045104574
 552 / 578 The train loss 1.02047475094
 553 / 578 The train loss 1.02040753605
 554 / 578 The train loss 1.02032388151
 555 / 578 The train loss 1.01992375476
 556 / 578 The train loss 1.01950145895
 557 / 578 The train loss 1.01921487145
 558 / 578 The train loss 1.01880454474
 559 / 578 The train loss 1.01852468292
 560 / 578 The train loss 1.01840490828
 561 / 578 The train loss 1.01816697226
 562 / 578 The train loss 1.01802615839
 563 / 578 The train loss 1.01757349057
 564 / 578 The train loss 1.01629269817
 565 / 578 The train loss 1.01500532127
 566 / 578 The train loss 1.01364267935
 567 / 578 The train loss 1.01226791373
 568 / 578 The train loss 1.01095019723
 569 / 578 The train loss 1.0097163349
 570 / 578 The train loss 1.00845199908
 571 / 578 The train loss 1.0071522209
 572 / 578 The train loss 1.00600704135
 573 / 578 The train loss 1.00581721185
 574 / 578 The train loss 1.00559667948
 575 / 578 The train loss 1.00472088618
 576 / 578 The train loss 1.00414753003
 577 / 578 The train loss 1.00363094163
 578 / 578 The train loss 1.00301876913
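The ` i / total The train loss <value>` lines above are a running-mean loss printout, updated after every batch. A minimal sketch of how such a log line can be produced is below; the helper name `log_running_loss` and the demo loss values are illustrative only, not taken from the project code.

```python
def log_running_loss(batch_losses, total, tag="train"):
    """Return one running-mean loss line per batch, matching the
    ' i / total The train loss <value>' format seen in the log above."""
    lines, running_sum = [], 0.0
    for i, loss in enumerate(batch_losses, start=1):
        running_sum += loss  # accumulate per-batch loss
        # running mean = sum of losses so far / number of batches so far
        lines.append(" %d / %d The %s loss %s" % (i, total, tag, running_sum / i))
    return lines

# Hypothetical per-batch losses for illustration (not the project's real data).
for line in log_running_loss([0.4, 0.2, 0.3], total=3):
    print(line)
```

Because the printed value is a cumulative mean over the epoch so far, a run of hard (large-angle) batches raises the reported loss gradually rather than as a spike, which matches the smooth rise and fall visible in the log.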

Starting epoch 2
Validation:
 1 / 30 The valid loss 0.461336702108
 ... (batches 2-29 omitted; the running valid loss peaked near 0.940 around batch 14) ...
 30 / 30 The valid loss 0.802814881504

Training
 1 / 578 The train loss 0.558399796486
 ... (batches 2-359 omitted; the running train loss dipped to about 0.178 near batch 33, then rose past 1.377 by batch 310) ...
 360 / 578 The train loss 1.29009950215
 361 / 578 The train loss 1.28964058799
 362 / 578 The train loss 1.28841071749
 363 / 578 The train loss 1.28671614793
 364 / 578 The train loss 1.28487663728
 365 / 578 The train loss 1.2835739982
 366 / 578 The train loss 1.28031696801
 367 / 578 The train loss 1.27702744841
 368 / 578 The train loss 1.2738372768
 369 / 578 The train loss 1.27063568364
 370 / 578 The train loss 1.26738598991
 371 / 578 The train loss 1.26420315952
 372 / 578 The train loss 1.26109340152
 373 / 578 The train loss 1.25796920539
 374 / 578 The train loss 1.25490922433
 375 / 578 The train loss 1.2519362721
 376 / 578 The train loss 1.24896677659
 377 / 578 The train loss 1.24595794352
 378 / 578 The train loss 1.24293763155
 379 / 578 The train loss 1.23994283005
 380 / 578 The train loss 1.237006018
 381 / 578 The train loss 1.23443495457
 382 / 578 The train loss 1.23213676457
 383 / 578 The train loss 1.2297486422
 384 / 578 The train loss 1.22735577764
 385 / 578 The train loss 1.2247734698
 386 / 578 The train loss 1.22197386387
 387 / 578 The train loss 1.21922596484
 388 / 578 The train loss 1.21653138766
 389 / 578 The train loss 1.21389476603
 390 / 578 The train loss 1.21116188042
 391 / 578 The train loss 1.20836400401
 392 / 578 The train loss 1.2056021377
 393 / 578 The train loss 1.20279547559
 394 / 578 The train loss 1.200050251
 395 / 578 The train loss 1.19723392021
 396 / 578 The train loss 1.19483692739
 397 / 578 The train loss 1.19289231434
 398 / 578 The train loss 1.19108984778
 399 / 578 The train loss 1.18920530052
 400 / 578 The train loss 1.1872985225
 401 / 578 The train loss 1.18537344627
 402 / 578 The train loss 1.1835678376
 403 / 578 The train loss 1.18218575257
 404 / 578 The train loss 1.18099660513
 405 / 578 The train loss 1.1800083573
 406 / 578 The train loss 1.17903486986
 407 / 578 The train loss 1.17782880639
 408 / 578 The train loss 1.17660010422
 409 / 578 The train loss 1.17785956725
 410 / 578 The train loss 1.18109406355
 411 / 578 The train loss 1.18339018961
 412 / 578 The train loss 1.18451522397
 413 / 578 The train loss 1.18528236811
 414 / 578 The train loss 1.18675047136
 415 / 578 The train loss 1.18767624797
 416 / 578 The train loss 1.18747411416
 417 / 578 The train loss 1.18694426364
 418 / 578 The train loss 1.18668184602
 419 / 578 The train loss 1.18641632943
 420 / 578 The train loss 1.18646252601
 421 / 578 The train loss 1.18643112044
 422 / 578 The train loss 1.18627818718
 423 / 578 The train loss 1.18646121524
 424 / 578 The train loss 1.18636702133
 425 / 578 The train loss 1.18611168123
 426 / 578 The train loss 1.18560965805
 427 / 578 The train loss 1.18451971407
 428 / 578 The train loss 1.18329550874
 429 / 578 The train loss 1.18187685979
 430 / 578 The train loss 1.17995884458
 431 / 578 The train loss 1.17789138641
 432 / 578 The train loss 1.17594524706
 433 / 578 The train loss 1.17374128463
 434 / 578 The train loss 1.17132319527
 435 / 578 The train loss 1.16893936865
 436 / 578 The train loss 1.16663545261
 437 / 578 The train loss 1.16479218136
 438 / 578 The train loss 1.16339466119
 439 / 578 The train loss 1.16234409205
 440 / 578 The train loss 1.16110609221
 441 / 578 The train loss 1.15985952706
 442 / 578 The train loss 1.15856751132
 443 / 578 The train loss 1.15746655549
 444 / 578 The train loss 1.15714269279
 445 / 578 The train loss 1.15714732763
 446 / 578 The train loss 1.15776011755
 447 / 578 The train loss 1.15808532375
 448 / 578 The train loss 1.15843607425
 449 / 578 The train loss 1.15831070773
 450 / 578 The train loss 1.15816294531
 451 / 578 The train loss 1.15797623688
 452 / 578 The train loss 1.15782543087
 453 / 578 The train loss 1.1575723199
 454 / 578 The train loss 1.15737071272
 455 / 578 The train loss 1.15701966318
 456 / 578 The train loss 1.1563271541
 457 / 578 The train loss 1.15574424044
 458 / 578 The train loss 1.15488877417
 459 / 578 The train loss 1.15412820392
 460 / 578 The train loss 1.15362833369
 461 / 578 The train loss 1.15295382339
 462 / 578 The train loss 1.15211302087
 463 / 578 The train loss 1.1509164481
 464 / 578 The train loss 1.14879700233
 465 / 578 The train loss 1.14662180037
 466 / 578 The train loss 1.14445289905
 467 / 578 The train loss 1.14227631519
 468 / 578 The train loss 1.14028692252
 469 / 578 The train loss 1.13829890955
 470 / 578 The train loss 1.13640300381
 471 / 578 The train loss 1.13453622118
 472 / 578 The train loss 1.13262037526
 473 / 578 The train loss 1.13066822602
 474 / 578 The train loss 1.12863328943
 475 / 578 The train loss 1.12665896275
 476 / 578 The train loss 1.12472902041
 477 / 578 The train loss 1.12279062973
 478 / 578 The train loss 1.12064350538
 479 / 578 The train loss 1.11849891705
 480 / 578 The train loss 1.1163949296
 481 / 578 The train loss 1.11438459317
 482 / 578 The train loss 1.11232794087
 483 / 578 The train loss 1.11024655863
 484 / 578 The train loss 1.10819587542
 485 / 578 The train loss 1.10621046021
 486 / 578 The train loss 1.10425588216
 487 / 578 The train loss 1.10229338167
 488 / 578 The train loss 1.10035995476
 489 / 578 The train loss 1.09842331717
 490 / 578 The train loss 1.09642019527
 491 / 578 The train loss 1.09457140912
 492 / 578 The train loss 1.09269793953
 493 / 578 The train loss 1.09086718738
 494 / 578 The train loss 1.08894455418
 495 / 578 The train loss 1.08702855736
 496 / 578 The train loss 1.08548405646
 497 / 578 The train loss 1.08410647709
 498 / 578 The train loss 1.0826052018
 499 / 578 The train loss 1.08103745465
 500 / 578 The train loss 1.07934940031
 501 / 578 The train loss 1.07753155031
 502 / 578 The train loss 1.07582550905
 503 / 578 The train loss 1.07410857775
 504 / 578 The train loss 1.07241580026
 505 / 578 The train loss 1.07080444428
 506 / 578 The train loss 1.06902293549
 507 / 578 The train loss 1.06733957959
 508 / 578 The train loss 1.06553852543
 509 / 578 The train loss 1.06370521704
 510 / 578 The train loss 1.06182965499
 511 / 578 The train loss 1.05991846227
 512 / 578 The train loss 1.05808546713
 513 / 578 The train loss 1.05620458689
 514 / 578 The train loss 1.05440996667
 515 / 578 The train loss 1.05265330781
 516 / 578 The train loss 1.05093462357
 517 / 578 The train loss 1.04924504181
 518 / 578 The train loss 1.04754151906
 519 / 578 The train loss 1.04586506362
 520 / 578 The train loss 1.04423210379
 521 / 578 The train loss 1.04265040358
 522 / 578 The train loss 1.04095272002
 523 / 578 The train loss 1.03927172807
 524 / 578 The train loss 1.03782134564
 525 / 578 The train loss 1.03632002628
 526 / 578 The train loss 1.03481905096
 527 / 578 The train loss 1.03347780861
 528 / 578 The train loss 1.03219227031
 529 / 578 The train loss 1.0310779289
 530 / 578 The train loss 1.02993694017
 531 / 578 The train loss 1.02857306017
 532 / 578 The train loss 1.02724851428
 533 / 578 The train loss 1.02594002775
 534 / 578 The train loss 1.02442479163
 535 / 578 The train loss 1.02282419663
 536 / 578 The train loss 1.0212595724
 537 / 578 The train loss 1.01972911335
 538 / 578 The train loss 1.01835336623
 539 / 578 The train loss 1.01698750624
 540 / 578 The train loss 1.01559224881
 541 / 578 The train loss 1.0143220135
 542 / 578 The train loss 1.01308614044
 543 / 578 The train loss 1.01161585192
 544 / 578 The train loss 1.01030731833
 545 / 578 The train loss 1.00937459913
 546 / 578 The train loss 1.00952907454
 547 / 578 The train loss 1.01009332182
 548 / 578 The train loss 1.01228178751
 549 / 578 The train loss 1.01336506445
 550 / 578 The train loss 1.01375477017
 551 / 578 The train loss 1.01385664146
 552 / 578 The train loss 1.01383528379
 553 / 578 The train loss 1.01381421225
 554 / 578 The train loss 1.01374288174
 555 / 578 The train loss 1.01335532189
 556 / 578 The train loss 1.01290314464
 557 / 578 The train loss 1.01258843737
 558 / 578 The train loss 1.01218341923
 559 / 578 The train loss 1.01188076562
 560 / 578 The train loss 1.01167729929
 561 / 578 The train loss 1.0114710814
 562 / 578 The train loss 1.01136855517
 563 / 578 The train loss 1.01083021103
 564 / 578 The train loss 1.00951952668
 565 / 578 The train loss 1.00820760649
 566 / 578 The train loss 1.00695345597
 567 / 578 The train loss 1.00566487638
 568 / 578 The train loss 1.00438606048
 569 / 578 The train loss 1.00317438244
 570 / 578 The train loss 1.0018955143
 571 / 578 The train loss 1.00056694217
 572 / 578 The train loss 0.999437198733
 573 / 578 The train loss 0.999288154691
 574 / 578 The train loss 0.998949831021
 575 / 578 The train loss 0.998135726413
 576 / 578 The train loss 0.997545235752
 577 / 578 The train loss 0.996879726661
 578 / 578 The train loss 0.996237571535

Starting epoch 3
Validation:
 1 / 30 The valid loss 0.460784345865
 2 / 30 The valid loss 0.435379177332
 3 / 30 The valid loss 0.413092037042
 4 / 30 The valid loss 0.389243915677
 5 / 30 The valid loss 0.368766969442
 6 / 30 The valid loss 0.408340786894
 7 / 30 The valid loss 0.522320044892
 8 / 30 The valid loss 0.667099159211
 9 / 30 The valid loss 0.815135594871
 10 / 30 The valid loss 0.895409092307
 11 / 30 The valid loss 0.91331414472
 12 / 30 The valid loss 0.924322215219
 13 / 30 The valid loss 0.933136616762
 14 / 30 The valid loss 0.93784418915
 15 / 30 The valid loss 0.920772598187
 16 / 30 The valid loss 0.913457227871
 17 / 30 The valid loss 0.926504596191
 18 / 30 The valid loss 0.936211067769
 19 / 30 The valid loss 0.93654053619
 20 / 30 The valid loss 0.930362452567
 21 / 30 The valid loss 0.91641057531
 22 / 30 The valid loss 0.90345176648
 23 / 30 The valid loss 0.891725046479
 24 / 30 The valid loss 0.883043119063
 25 / 30 The valid loss 0.874727605581
 26 / 30 The valid loss 0.87115476567
 27 / 30 The valid loss 0.860347996155
 28 / 30 The valid loss 0.840866930783
 29 / 30 The valid loss 0.818836713146
 30 / 30 The valid loss 0.8004650111
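The validation loss above improves through the first few batches, climbs sharply, and only partially recovers by the end of the pass. When monitoring runs like this, it is common to track the best validation loss seen so far and checkpoint (or stop) when it stops improving. A minimal sketch of that bookkeeping, under stated assumptions (the helper `best_checkpoint` and its `patience` parameter are hypothetical, not part of the notebook's code):

```python
def best_checkpoint(valid_losses, patience=5):
    """Scan a sequence of per-batch validation losses.

    Returns (best_loss, best_step, exhausted), where `exhausted` is True
    if the loss failed to improve for `patience` consecutive steps --
    the point at which one would typically stop or restore a checkpoint.
    """
    best_loss, best_step = float("inf"), -1
    since_improvement = 0
    for step, loss in enumerate(valid_losses, start=1):
        if loss < best_loss:
            # New best: remember it and reset the patience counter.
            best_loss, best_step = loss, step
            since_improvement = 0
        else:
            since_improvement += 1
            if since_improvement >= patience:
                return best_loss, best_step, True
    return best_loss, best_step, False
```

Applied to the 30 validation losses above, this would flag the early minimum (around batch 5) as the best point of the pass. Keras offers the same behavior built in via the `EarlyStopping` and `ModelCheckpoint` callbacks.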

Training
 1 / 578 The train loss 0.614065885544
 2 / 578 The train loss 0.476967617869
 3 / 578 The train loss 0.415765513976
 4 / 578 The train loss 0.399163730443
 5 / 578 The train loss 0.383987414837
 6 / 578 The train loss 0.35549774766
 7 / 578 The train loss 0.331041719232
 8 / 578 The train loss 0.308947680518
 9 / 578 The train loss 0.283599363433
 10 / 578 The train loss 0.265755201131
 11 / 578 The train loss 0.251788311384
 12 / 578 The train loss 0.242237719397
 13 / 578 The train loss 0.24025823749
 14 / 578 The train loss 0.250552926745
 15 / 578 The train loss 0.250160469611
 16 / 578 The train loss 0.248722155578
 17 / 578 The train loss 0.246597067398
 18 / 578 The train loss 0.241028394136
 19 / 578 The train loss 0.233169952506
 20 / 578 The train loss 0.225467022881
 21 / 578 The train loss 0.219148678794
 22 / 578 The train loss 0.213323353028
 23 / 578 The train loss 0.206918138849
 24 / 578 The train loss 0.201768238408
 25 / 578 The train loss 0.197846720219
 26 / 578 The train loss 0.193780043377
 27 / 578 The train loss 0.190702177032
 28 / 578 The train loss 0.190020251753
 29 / 578 The train loss 0.18766199746
 30 / 578 The train loss 0.185453070452
 31 / 578 The train loss 0.184135371879
 32 / 578 The train loss 0.182501239935
 33 / 578 The train loss 0.182071424569
 34 / 578 The train loss 0.181629082955
 35 / 578 The train loss 0.183626706472
 36 / 578 The train loss 0.188804254971
 37 / 578 The train loss 0.192396355843
 38 / 578 The train loss 0.196234547974
 39 / 578 The train loss 0.19885657785
 40 / 578 The train loss 0.199245774932
 41 / 578 The train loss 0.198731678833
 42 / 578 The train loss 0.198325996065
 43 / 578 The train loss 0.200369747052
 44 / 578 The train loss 0.20174545473
 45 / 578 The train loss 0.201784265869
 46 / 578 The train loss 0.201246832538
 47 / 578 The train loss 0.199988799685
 48 / 578 The train loss 0.198833692043
 49 / 578 The train loss 0.19827601268
 50 / 578 The train loss 0.199726875573
 51 / 578 The train loss 0.201854070758
 52 / 578 The train loss 0.203492381968
 53 / 578 The train loss 0.205316673051
 54 / 578 The train loss 0.205834687998
 55 / 578 The train loss 0.206061345474
 56 / 578 The train loss 0.205665258558
 57 / 578 The train loss 0.208047471679
 58 / 578 The train loss 0.210510137395
 59 / 578 The train loss 0.215123169629
 60 / 578 The train loss 0.222657546525
 61 / 578 The train loss 0.231936108016
 62 / 578 The train loss 0.24265329023
 63 / 578 The train loss 0.250900905402
 64 / 578 The train loss 0.256956911762
 65 / 578 The train loss 0.261441518825
 66 / 578 The train loss 0.262745701115
 67 / 578 The train loss 0.264490897642
 68 / 578 The train loss 0.265307185076
 69 / 578 The train loss 0.267504804916
 70 / 578 The train loss 0.270725204796
 71 / 578 The train loss 0.275247933684
 72 / 578 The train loss 0.281438201355
 73 / 578 The train loss 0.286593643464
 74 / 578 The train loss 0.289959570544
 75 / 578 The train loss 0.295233533879
 76 / 578 The train loss 0.297087038132
 77 / 578 The train loss 0.297961751072
 78 / 578 The train loss 0.300335040937
 79 / 578 The train loss 0.30263920383
 80 / 578 The train loss 0.30436828332
 81 / 578 The train loss 0.30476193452
 82 / 578 The train loss 0.304374274412
 83 / 578 The train loss 0.304726614381
 84 / 578 The train loss 0.304825896752
 85 / 578 The train loss 0.304254216043
 86 / 578 The train loss 0.30303326483
 87 / 578 The train loss 0.301752749103
 88 / 578 The train loss 0.300465503962
 89 / 578 The train loss 0.301748309326
 90 / 578 The train loss 0.301837408625
 91 / 578 The train loss 0.300489790083
 92 / 578 The train loss 0.299201052231
 93 / 578 The train loss 0.298251189291
 94 / 578 The train loss 0.296591506597
 95 / 578 The train loss 0.294618247136
 96 / 578 The train loss 0.293719963869
 97 / 578 The train loss 0.292828062069
 98 / 578 The train loss 0.292900338085
 99 / 578 The train loss 0.293090826258
 100 / 578 The train loss 0.293728661165
 101 / 578 The train loss 0.294658549868
 102 / 578 The train loss 0.29561593683
 103 / 578 The train loss 0.295361629317
 104 / 578 The train loss 0.295381637362
 105 / 578 The train loss 0.295128427162
 106 / 578 The train loss 0.295315816647
 107 / 578 The train loss 0.295631485495
 108 / 578 The train loss 0.295494644385
 109 / 578 The train loss 0.295384775266
 110 / 578 The train loss 0.295414047553
 111 / 578 The train loss 0.297104064073
 112 / 578 The train loss 0.29734034803
 113 / 578 The train loss 0.297855558583
 114 / 578 The train loss 0.3010874986
 115 / 578 The train loss 0.303302286436
 116 / 578 The train loss 0.306342575506
 117 / 578 The train loss 0.309714782888
 118 / 578 The train loss 0.313028063246
 119 / 578 The train loss 0.316074636741
 120 / 578 The train loss 0.320164307393
 121 / 578 The train loss 0.323888598826
 122 / 578 The train loss 0.328147595473
 123 / 578 The train loss 0.33621165853
 124 / 578 The train loss 0.346618328784
 125 / 578 The train loss 0.35648457092
 126 / 578 The train loss 0.366140475763
 127 / 578 The train loss 0.377661435095
 128 / 578 The train loss 0.387370251527
 129 / 578 The train loss 0.398448763372
 130 / 578 The train loss 0.411060867344
 131 / 578 The train loss 0.42090650227
 132 / 578 The train loss 0.425487572892
 133 / 578 The train loss 0.428028197609
 134 / 578 The train loss 0.429770802773
 135 / 578 The train loss 0.433003847191
 136 / 578 The train loss 0.437705249854
 137 / 578 The train loss 0.4432253911
 138 / 578 The train loss 0.446123758466
 139 / 578 The train loss 0.448040357543
 140 / 578 The train loss 0.448876552443
 141 / 578 The train loss 0.448466110684
 142 / 578 The train loss 0.448553120744
 143 / 578 The train loss 0.450397105223
 144 / 578 The train loss 0.451991128166
 145 / 578 The train loss 0.453155828087
 146 / 578 The train loss 0.453497248715
 147 / 578 The train loss 0.453494687216
 148 / 578 The train loss 0.452586498162
 149 / 578 The train loss 0.451922184619
 150 / 578 The train loss 0.451553986818
 151 / 578 The train loss 0.449645935453
 152 / 578 The train loss 0.448030414697
 153 / 578 The train loss 0.446031894024
 154 / 578 The train loss 0.44449129154
 155 / 578 The train loss 0.443596000585
 156 / 578 The train loss 0.443575429563
 157 / 578 The train loss 0.443563519722
 158 / 578 The train loss 0.443411166108
 159 / 578 The train loss 0.442915146034
 160 / 578 The train loss 0.442259458592
 161 / 578 The train loss 0.441126152919
 162 / 578 The train loss 0.44037752272
 163 / 578 The train loss 0.439702435825
 164 / 578 The train loss 0.438692219692
 165 / 578 The train loss 0.438028864743
 166 / 578 The train loss 0.436820705029
 167 / 578 The train loss 0.435466204798
 168 / 578 The train loss 0.438998896673
 169 / 578 The train loss 0.446692519154
 170 / 578 The train loss 0.455205994012
 171 / 578 The train loss 0.462913894035
 172 / 578 The train loss 0.469457807453
 173 / 578 The train loss 0.475324103813
 174 / 578 The train loss 0.482164681857
 175 / 578 The train loss 0.490928834038
 176 / 578 The train loss 0.497729835397
 177 / 578 The train loss 0.502780321838
 178 / 578 The train loss 0.507170145641
 179 / 578 The train loss 0.511547504399
 180 / 578 The train loss 0.515074030765
 181 / 578 The train loss 0.517694205416
 182 / 578 The train loss 0.519325705225
 183 / 578 The train loss 0.520785239428
 184 / 578 The train loss 0.525817145674
 185 / 578 The train loss 0.531382501649
 186 / 578 The train loss 0.540756923096
 187 / 578 The train loss 0.554984207659
 188 / 578 The train loss 0.567190734828
 189 / 578 The train loss 0.578590987418
 190 / 578 The train loss 0.593595922817
 191 / 578 The train loss 0.606570768302
 192 / 578 The train loss 0.617902777041
 193 / 578 The train loss 0.62724021081
 194 / 578 The train loss 0.634246737264
 195 / 578 The train loss 0.638643607918
 196 / 578 The train loss 0.643478935052
 197 / 578 The train loss 0.651429205566
 198 / 578 The train loss 0.655439120539
 199 / 578 The train loss 0.659570797791
 200 / 578 The train loss 0.66528198678
 201 / 578 The train loss 0.674504299847
 202 / 578 The train loss 0.687484579105
 203 / 578 The train loss 0.69951746811
 204 / 578 The train loss 0.708733292954
 205 / 578 The train loss 0.715686669764
 206 / 578 The train loss 0.718980315422
 207 / 578 The train loss 0.721937688832
 208 / 578 The train loss 0.729568780794
 209 / 578 The train loss 0.740989027019
 210 / 578 The train loss 0.748542145498
 211 / 578 The train loss 0.754473902214
 212 / 578 The train loss 0.761446035309
 213 / 578 The train loss 0.767714613655
 214 / 578 The train loss 0.773165458448
 215 / 578 The train loss 0.777203764091
 216 / 578 The train loss 0.78013522002
 217 / 578 The train loss 0.781633437126
 218 / 578 The train loss 0.783103061003
 219 / 578 The train loss 0.785420015148
 220 / 578 The train loss 0.788014259291
 221 / 578 The train loss 0.791673860028
 222 / 578 The train loss 0.795825122344
 223 / 578 The train loss 0.799767396008
 224 / 578 The train loss 0.80440679901
 225 / 578 The train loss 0.81126231369
 226 / 578 The train loss 0.818731263313
 227 / 578 The train loss 0.826397139234
 228 / 578 The train loss 0.832505132709
 229 / 578 The train loss 0.838767662482
 230 / 578 The train loss 0.84476372849
 231 / 578 The train loss 0.851421988533
 232 / 578 The train loss 0.856838110866
 233 / 578 The train loss 0.862743516497
 234 / 578 The train loss 0.869573101688
 235 / 578 The train loss 0.878940605261
 236 / 578 The train loss 0.884835675264
 237 / 578 The train loss 0.886330384903
 238 / 578 The train loss 0.886239569634
 239 / 578 The train loss 0.886473150589
 240 / 578 The train loss 0.88727878
 241 / 578 The train loss 0.889048892587
 242 / 578 The train loss 0.89175532983
 243 / 578 The train loss 0.89415936868
 244 / 578 The train loss 0.897913412145
 245 / 578 The train loss 0.903044197663
 246 / 578 The train loss 0.908260892229
 247 / 578 The train loss 0.910583894896
 248 / 578 The train loss 0.912370392542
 249 / 578 The train loss 0.914036174047
 250 / 578 The train loss 0.918596166283
 251 / 578 The train loss 0.923604779658
 252 / 578 The train loss 0.928611855333
 253 / 578 The train loss 0.935227125206
 254 / 578 The train loss 0.944385747887
 255 / 578 The train loss 0.955847556597
 256 / 578 The train loss 0.969138335116
 257 / 578 The train loss 0.980773597057
 258 / 578 The train loss 0.990051889287
 259 / 578 The train loss 0.996065989108
 260 / 578 The train loss 1.00179619024
 261 / 578 The train loss 1.00732909094
 262 / 578 The train loss 1.01219048087
 263 / 578 The train loss 1.01720772556
 264 / 578 The train loss 1.02406634134
 265 / 578 The train loss 1.03051964621
 266 / 578 The train loss 1.03666959459
 267 / 578 The train loss 1.04380562212
 268 / 578 The train loss 1.05260606902
 269 / 578 The train loss 1.06300234321
 270 / 578 The train loss 1.07497739585
 271 / 578 The train loss 1.08536706169
 272 / 578 The train loss 1.09437041042
 273 / 578 The train loss 1.10418581496
 274 / 578 The train loss 1.11358023791
 275 / 578 The train loss 1.12265153508
 276 / 578 The train loss 1.13112011825
 277 / 578 The train loss 1.13917091233
 278 / 578 The train loss 1.1477456493
 279 / 578 The train loss 1.1566263413
 280 / 578 The train loss 1.16730227867
 281 / 578 The train loss 1.17706128512
 282 / 578 The train loss 1.18642535671
 283 / 578 The train loss 1.19651861044
 284 / 578 The train loss 1.2048977132
 285 / 578 The train loss 1.21334854231
 286 / 578 The train loss 1.22153058307
 287 / 578 The train loss 1.22946154254
 288 / 578 The train loss 1.23706320479
 289 / 578 The train loss 1.24404100865
 290 / 578 The train loss 1.25216537562
 291 / 578 The train loss 1.26038124525
 292 / 578 The train loss 1.26946638925
 293 / 578 The train loss 1.27795291651
 294 / 578 The train loss 1.286616377
 295 / 578 The train loss 1.29470321498
 296 / 578 The train loss 1.3013695822
 297 / 578 The train loss 1.30808245024
 298 / 578 The train loss 1.31423607981
 299 / 578 The train loss 1.32120761158
 300 / 578 The train loss 1.32890780501
 301 / 578 The train loss 1.33616510313
 302 / 578 The train loss 1.34397620843
 303 / 578 The train loss 1.35080738953
 304 / 578 The train loss 1.35716846166
 305 / 578 The train loss 1.36252426371
 306 / 578 The train loss 1.36785282937
 307 / 578 The train loss 1.37308543486
 308 / 578 The train loss 1.3778928655
 309 / 578 The train loss 1.38188963543
 310 / 578 The train loss 1.38354047318
 311 / 578 The train loss 1.38344410045
 312 / 578 The train loss 1.38244550706
 313 / 578 The train loss 1.38065690423
 314 / 578 The train loss 1.37912019378
 315 / 578 The train loss 1.37786518253
 316 / 578 The train loss 1.378090573
 317 / 578 The train loss 1.37825968262
 318 / 578 The train loss 1.3779634767
 319 / 578 The train loss 1.37725235052
 320 / 578 The train loss 1.3758984878
 321 / 578 The train loss 1.37307940477
 322 / 578 The train loss 1.36974810455
 323 / 578 The train loss 1.36628883517
 324 / 578 The train loss 1.36267977362
 325 / 578 The train loss 1.35883925266
 326 / 578 The train loss 1.35487038784
 327 / 578 The train loss 1.35100263321
 328 / 578 The train loss 1.34720430362
 329 / 578 The train loss 1.34350415446
 330 / 578 The train loss 1.33987968964
 331 / 578 The train loss 1.3361486994
 332 / 578 The train loss 1.33240924493
 333 / 578 The train loss 1.32924660096
 334 / 578 The train loss 1.32655628788
 335 / 578 The train loss 1.32520998472
 336 / 578 The train loss 1.32511829855
 337 / 578 The train loss 1.32564892529
 338 / 578 The train loss 1.32568149829
 339 / 578 The train loss 1.32577163372
 340 / 578 The train loss 1.3245826797
 341 / 578 The train loss 1.322126037
 342 / 578 The train loss 1.31929480032
 343 / 578 The train loss 1.31696222662
 344 / 578 The train loss 1.314619131
 345 / 578 The train loss 1.31255348087
 346 / 578 The train loss 1.31050440148
 347 / 578 The train loss 1.30869492137
 348 / 578 The train loss 1.30757471812
 349 / 578 The train loss 1.30714547425
 350 / 578 The train loss 1.30664917805
 351 / 578 The train loss 1.30593986249
 352 / 578 The train loss 1.30517434257
 353 / 578 The train loss 1.30424995526
 354 / 578 The train loss 1.30279679601
 355 / 578 The train loss 1.30233146586
 356 / 578 The train loss 1.30203319446
 357 / 578 The train loss 1.30140408193
 358 / 578 The train loss 1.30078773138
 359 / 578 The train loss 1.29987616854
 360 / 578 The train loss 1.29904112464
 361 / 578 The train loss 1.29822571766
 362 / 578 The train loss 1.29704077134
 363 / 578 The train loss 1.29556823085
 364 / 578 The train loss 1.2936503294
 365 / 578 The train loss 1.29195111676
 366 / 578 The train loss 1.28866425081
 367 / 578 The train loss 1.28535294833
 368 / 578 The train loss 1.28208512274
 369 / 578 The train loss 1.27879707418
 370 / 578 The train loss 1.2754633921
 371 / 578 The train loss 1.27219129558
 372 / 578 The train loss 1.26899918979
 373 / 578 The train loss 1.265880112
 374 / 578 The train loss 1.26278529371
 375 / 578 The train loss 1.25969959568
 376 / 578 The train loss 1.2565823527
 377 / 578 The train loss 1.25349017526
 378 / 578 The train loss 1.25042640405
 379 / 578 The train loss 1.24745057575
 380 / 578 The train loss 1.24445798292
 381 / 578 The train loss 1.24177394824
 382 / 578 The train loss 1.23935231166
 383 / 578 The train loss 1.23692402754
 384 / 578 The train loss 1.23435546455
 385 / 578 The train loss 1.23180718806
 386 / 578 The train loss 1.22901390853
 387 / 578 The train loss 1.22615311829
 388 / 578 The train loss 1.2233684267
 389 / 578 The train loss 1.22074071482
 390 / 578 The train loss 1.21802642664
 391 / 578 The train loss 1.21517466172
 392 / 578 The train loss 1.21235627739
 393 / 578 The train loss 1.20959386958
 394 / 578 The train loss 1.2068428218
 395 / 578 The train loss 1.20399165654
 396 / 578 The train loss 1.20149708538
 397 / 578 The train loss 1.19935005677
 398 / 578 The train loss 1.19745704194
 399 / 578 The train loss 1.19551200443
 400 / 578 The train loss 1.19359778977
 401 / 578 The train loss 1.19162367629
 402 / 578 The train loss 1.18977248549
 403 / 578 The train loss 1.18841995316
 404 / 578 The train loss 1.18702702209
 405 / 578 The train loss 1.18581778958
 406 / 578 The train loss 1.18463043625
 407 / 578 The train loss 1.18343002174
 408 / 578 The train loss 1.18230574139
 409 / 578 The train loss 1.18381573736
 410 / 578 The train loss 1.18696426458
 411 / 578 The train loss 1.18905867262
 412 / 578 The train loss 1.18995636976
 413 / 578 The train loss 1.19076676763
 414 / 578 The train loss 1.1920625517
 415 / 578 The train loss 1.19317969858
 416 / 578 The train loss 1.19281515894
 417 / 578 The train loss 1.19209286291
 418 / 578 The train loss 1.19142657004
 419 / 578 The train loss 1.19129922139
 420 / 578 The train loss 1.19126396093
 421 / 578 The train loss 1.19113135422
 422 / 578 The train loss 1.19090328057
 423 / 578 The train loss 1.19088977814
 424 / 578 The train loss 1.19089827447
 425 / 578 The train loss 1.19033289894
 426 / 578 The train loss 1.18946992245
 427 / 578 The train loss 1.1883565514
 428 / 578 The train loss 1.18705533152
 429 / 578 The train loss 1.18559505111
 430 / 578 The train loss 1.18355534007
 431 / 578 The train loss 1.18153899845
 432 / 578 The train loss 1.17950532516
 433 / 578 The train loss 1.17718541955
 434 / 578 The train loss 1.17473411631
 435 / 578 The train loss 1.17228764024
 436 / 578 The train loss 1.16990135421
 437 / 578 The train loss 1.16800341756
 438 / 578 The train loss 1.16659518113
 439 / 578 The train loss 1.16542494476
 440 / 578 The train loss 1.16422430401
 441 / 578 The train loss 1.16285837997
 442 / 578 The train loss 1.16164990628
 443 / 578 The train loss 1.16038655134
 444 / 578 The train loss 1.15983921896
 445 / 578 The train loss 1.16000152132
 446 / 578 The train loss 1.16060116649
 447 / 578 The train loss 1.16121408326
 448 / 578 The train loss 1.16137317637
 449 / 578 The train loss 1.16127959885
 450 / 578 The train loss 1.16108051475
 451 / 578 The train loss 1.16094336504
 ... (batches 452-576 omitted; the running-average train loss declines steadily from ~1.16 toward 1.00) ...
 577 / 578 The train loss 0.998836374178
 578 / 578 The train loss 0.99821672392
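The per-batch lines above follow a fixed text format, so the running-average losses can be recovered from the log for plotting or comparison across epochs. A minimal sketch of such a parser (the `parse_losses` helper and the `sample` string are illustrative, not part of the project code):

```python
import re

def parse_losses(log_text, kind="train"):
    """Extract running-average loss values from lines like
    ' 450 / 578 The train loss 1.16108051475'."""
    pattern = re.compile(r"\d+\s*/\s*\d+ The %s loss ([0-9.]+)" % kind)
    return [float(m.group(1)) for m in pattern.finditer(log_text)]

# Example on a two-line excerpt of the log:
sample = """ 1 / 578 The train loss 0.585343718529
 2 / 578 The train loss 0.462313354015"""
print(parse_losses(sample))  # [0.585343718529, 0.462313354015]
```

The returned list can be passed directly to `plt.plot` to visualize how the loss evolves within an epoch.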

Starting epoch 4
Validation:
 1 / 30 The valid loss 0.461691737175
 2 / 30 The valid loss 0.43730455637
 ... (batches 3-28 omitted; the running-average valid loss dips to ~0.38, peaks near 0.95 mid-epoch, then falls back) ...
 29 / 30 The valid loss 0.833514231546
 30 / 30 The valid loss 0.814645743867

Training
 1 / 578 The train loss 0.585343718529
 2 / 578 The train loss 0.462313354015
 ... (batches 3-576 omitted; the running average dips to ~0.18 early, climbs to a peak near 1.39 around batch 310, then eases back toward 1.00) ...
 577 / 578 The train loss 1.00087527159
 578 / 578 The train loss 1.00022822773

Starting epoch 5
Validation:
 1 / 30 The valid loss 0.461230546236
 2 / 30 The valid loss 0.436736941338
 ... (batches 3-28 omitted; the running-average valid loss dips to ~0.38, peaks near 0.95 mid-epoch, then falls back) ...
 29 / 30 The valid loss 0.831562151683
 30 / 30 The valid loss 0.812757950524

Training
 1 / 578 The train loss 0.582132935524
 2 / 578 The train loss 0.448431879282
 3 / 578 The train loss 0.399144252141
 4 / 578 The train loss 0.389005988836
 5 / 578 The train loss 0.388296306133
 6 / 578 The train loss 0.35618964086
 7 / 578 The train loss 0.327719115785
 8 / 578 The train loss 0.300493023358
 9 / 578 The train loss 0.276602326996
 10 / 578 The train loss 0.260494060069
 11 / 578 The train loss 0.250355704942
 12 / 578 The train loss 0.239170143381
 13 / 578 The train loss 0.234192600044
 14 / 578 The train loss 0.240105626306
 15 / 578 The train loss 0.239837565521
 16 / 578 The train loss 0.240404735785
 17 / 578 The train loss 0.238166394041
 18 / 578 The train loss 0.232135154721
 19 / 578 The train loss 0.224954115325
 20 / 578 The train loss 0.218580621853
 21 / 578 The train loss 0.212597872885
 22 / 578 The train loss 0.206349118528
 23 / 578 The train loss 0.200017811973
 24 / 578 The train loss 0.194751938339
 25 / 578 The train loss 0.190755551904
 26 / 578 The train loss 0.186484931323
 27 / 578 The train loss 0.183370818281
 28 / 578 The train loss 0.182498233792
 29 / 578 The train loss 0.180666026499
 30 / 578 The train loss 0.178627713645
 31 / 578 The train loss 0.177055371505
 32 / 578 The train loss 0.17541861895
 33 / 578 The train loss 0.175474920846
 ... (batches 34–577 elided: the cumulative training loss climbed from ~0.176 to a peak of ~1.254 near batch 310, then fell back toward 0.91) ...
 578 / 578 The train loss 0.91314507725
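The logged values above appear to be cumulative running averages over the epoch (each printed number is the mean of all batch losses so far), which is why they drift smoothly rather than jumping batch to batch. A minimal sketch of that logging scheme under that assumption — `log_running_loss` and the per-batch loss values are hypothetical, not from the original code:

```python
import numpy as np

def log_running_loss(batch_losses, n_batches=None, tag='train'):
    """Rebuild log lines of the form ' <i> / <n> The <tag> loss <avg>',
    where <avg> is the cumulative mean of all batch losses seen so far."""
    n = n_batches or len(batch_losses)
    lines = []
    for i in range(1, len(batch_losses) + 1):
        avg = float(np.mean(batch_losses[:i]))  # running mean up to batch i
        lines.append(' %d / %d The %s loss %s' % (i, n, tag, avg))
    return lines

# hypothetical per-batch MSE values
for line in log_running_loss([0.4, 0.2, 0.3]):
    print(line)
```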

Starting epoch 6
Validation:
 1 / 30 The valid loss 0.459952056408
 ... (batches 2–29 elided: the validation loss dipped to ~0.374 at batch 5, peaked at ~0.947 near batch 14, then eased) ...
 30 / 30 The valid loss 0.81040789187

Training
 1 / 578 The train loss 0.626258134842
 ... (batches 2–318 elided: the cumulative training loss fell to ~0.171 near batch 33, then rose steadily past 1.17) ...
 319 / 578 The train loss 1.17115553032
 320 / 578 The train loss 1.17075355747
 321 / 578 The train loss 1.16939661831
 322 / 578 The train loss 1.16728804779
 323 / 578 The train loss 1.16520236922
 324 / 578 The train loss 1.16291201625
 325 / 578 The train loss 1.16073999052
 326 / 578 The train loss 1.15837373191
 327 / 578 The train loss 1.15598972023
 328 / 578 The train loss 1.15321450258
 329 / 578 The train loss 1.1506859407
 330 / 578 The train loss 1.14784266443
 331 / 578 The train loss 1.14499884686
 332 / 578 The train loss 1.14216040586
 333 / 578 The train loss 1.13945774058
 334 / 578 The train loss 1.13738098027
 335 / 578 The train loss 1.13669693363
 336 / 578 The train loss 1.13684158382
 337 / 578 The train loss 1.13767265318
 338 / 578 The train loss 1.13717579524
 339 / 578 The train loss 1.13562874826
 340 / 578 The train loss 1.1331803356
 341 / 578 The train loss 1.13098871769
 342 / 578 The train loss 1.12846548193
 343 / 578 The train loss 1.1264920816
 344 / 578 The train loss 1.12454630461
 345 / 578 The train loss 1.12279991231
 346 / 578 The train loss 1.1211200552
 347 / 578 The train loss 1.11975783702
 348 / 578 The train loss 1.11895000412
 349 / 578 The train loss 1.11866756932
 350 / 578 The train loss 1.11835445157
 351 / 578 The train loss 1.11779547052
 352 / 578 The train loss 1.11747490115
 353 / 578 The train loss 1.11694029683
 354 / 578 The train loss 1.1162335136
 355 / 578 The train loss 1.11609059228
 356 / 578 The train loss 1.11612202888
 357 / 578 The train loss 1.1162654022
 358 / 578 The train loss 1.11601660946
 359 / 578 The train loss 1.1155539287
 360 / 578 The train loss 1.114942361
 361 / 578 The train loss 1.11425416663
 362 / 578 The train loss 1.11349526468
 363 / 578 The train loss 1.11230593747
 364 / 578 The train loss 1.11087613635
 365 / 578 The train loss 1.10960286292
 366 / 578 The train loss 1.10681066654
 367 / 578 The train loss 1.10398172916
 368 / 578 The train loss 1.10119483442
 369 / 578 The train loss 1.09849651479
 370 / 578 The train loss 1.09569512177
 371 / 578 The train loss 1.09293303307
 372 / 578 The train loss 1.09022313741
 373 / 578 The train loss 1.08746018343
 374 / 578 The train loss 1.08474196149
 375 / 578 The train loss 1.08205149194
 376 / 578 The train loss 1.07935818921
 377 / 578 The train loss 1.07668537818
 378 / 578 The train loss 1.07396881966
 379 / 578 The train loss 1.07132465911
 380 / 578 The train loss 1.06876405513
 381 / 578 The train loss 1.06647755127
 382 / 578 The train loss 1.06433188709
 383 / 578 The train loss 1.06214759287
 384 / 578 The train loss 1.05988134417
 385 / 578 The train loss 1.05747652201
 386 / 578 The train loss 1.05494947168
 387 / 578 The train loss 1.05246034168
 388 / 578 The train loss 1.05006798484
 389 / 578 The train loss 1.0478431096
 390 / 578 The train loss 1.04543387059
 391 / 578 The train loss 1.04296153107
 392 / 578 The train loss 1.04054705266
 393 / 578 The train loss 1.03814188563
 394 / 578 The train loss 1.03569907033
 395 / 578 The train loss 1.03327586547
 396 / 578 The train loss 1.03106585151
 397 / 578 The train loss 1.02927340817
 398 / 578 The train loss 1.02760832234
 399 / 578 The train loss 1.02590458089
 400 / 578 The train loss 1.0241279914
 401 / 578 The train loss 1.02248311494
 402 / 578 The train loss 1.02103905934
 403 / 578 The train loss 1.01987504632
 404 / 578 The train loss 1.01880574505
 405 / 578 The train loss 1.01764666561
 406 / 578 The train loss 1.0164965024
 407 / 578 The train loss 1.01530629175
 408 / 578 The train loss 1.01404401001
 409 / 578 The train loss 1.01506874131
 410 / 578 The train loss 1.01807729077
 411 / 578 The train loss 1.0200835311
 412 / 578 The train loss 1.02064235536
 413 / 578 The train loss 1.02116851554
 414 / 578 The train loss 1.02257534802
 415 / 578 The train loss 1.02377907467
 416 / 578 The train loss 1.02363798737
 417 / 578 The train loss 1.02307484631
 418 / 578 The train loss 1.02251503312
 419 / 578 The train loss 1.02222650829
 420 / 578 The train loss 1.0220807301
 421 / 578 The train loss 1.02180820872
 422 / 578 The train loss 1.02164363492
 423 / 578 The train loss 1.02182694077
 424 / 578 The train loss 1.02173966955
 425 / 578 The train loss 1.02139416875
 426 / 578 The train loss 1.02070902181
 427 / 578 The train loss 1.0197775876
 428 / 578 The train loss 1.01850411782
 429 / 578 The train loss 1.01706780947
 430 / 578 The train loss 1.01521692472
 431 / 578 The train loss 1.01334207187
 432 / 578 The train loss 1.01141691748
 433 / 578 The train loss 1.00929763087
 434 / 578 The train loss 1.00712946792
 435 / 578 The train loss 1.00495595852
 436 / 578 The train loss 1.00275682491
 437 / 578 The train loss 1.00083997158
 438 / 578 The train loss 0.999337946588
 439 / 578 The train loss 0.998074857828
 440 / 578 The train loss 0.996712285831
 441 / 578 The train loss 0.995176069058
 442 / 578 The train loss 0.993612558063
 443 / 578 The train loss 0.99246000861
 444 / 578 The train loss 0.991908858161
 445 / 578 The train loss 0.991682629848
 446 / 578 The train loss 0.992062653603
 447 / 578 The train loss 0.992307328213
 448 / 578 The train loss 0.992363576586
 449 / 578 The train loss 0.992228973491
 450 / 578 The train loss 0.991805375359
 451 / 578 The train loss 0.991360899557
 452 / 578 The train loss 0.99098056971
 453 / 578 The train loss 0.990569032523
 454 / 578 The train loss 0.990098626414
 455 / 578 The train loss 0.989425018667
 456 / 578 The train loss 0.98871640803
 457 / 578 The train loss 0.988056181109
 458 / 578 The train loss 0.987181325089
 459 / 578 The train loss 0.986209080183
 460 / 578 The train loss 0.985429023344
 461 / 578 The train loss 0.984693140187
 462 / 578 The train loss 0.983678822595
 463 / 578 The train loss 0.982777446903
 464 / 578 The train loss 0.981207830069
 465 / 578 The train loss 0.979650685488
 466 / 578 The train loss 0.977975995284
 467 / 578 The train loss 0.976287518606
 468 / 578 The train loss 0.974684534204
 469 / 578 The train loss 0.973286710862
 470 / 578 The train loss 0.97199663955
 471 / 578 The train loss 0.970721052497
 472 / 578 The train loss 0.969397984737
 473 / 578 The train loss 0.968062688736
 474 / 578 The train loss 0.966676954769
 475 / 578 The train loss 0.965331079685
 476 / 578 The train loss 0.964069623758
 477 / 578 The train loss 0.962710489369
 478 / 578 The train loss 0.961094240771
 479 / 578 The train loss 0.959276399783
 480 / 578 The train loss 0.957576576659
 481 / 578 The train loss 0.955858945328
 482 / 578 The train loss 0.954143200649
 483 / 578 The train loss 0.952366387043
 484 / 578 The train loss 0.950708390491
 485 / 578 The train loss 0.949097340109
 486 / 578 The train loss 0.94749135813
 487 / 578 The train loss 0.945863470076
 488 / 578 The train loss 0.944236460562
 489 / 578 The train loss 0.942557233068
 490 / 578 The train loss 0.940843172425
 491 / 578 The train loss 0.939259775142
 492 / 578 The train loss 0.937661493326
 493 / 578 The train loss 0.93611747074
 494 / 578 The train loss 0.934498631663
 495 / 578 The train loss 0.933029995625
 496 / 578 The train loss 0.931885480603
 497 / 578 The train loss 0.930800301051
 498 / 578 The train loss 0.929698712937
 499 / 578 The train loss 0.928585990181
 500 / 578 The train loss 0.927231447779
 501 / 578 The train loss 0.925735812932
 502 / 578 The train loss 0.924335389966
 503 / 578 The train loss 0.923057803984
 504 / 578 The train loss 0.921881728163
 505 / 578 The train loss 0.920627906587
 506 / 578 The train loss 0.91940762149
 507 / 578 The train loss 0.91815698005
 508 / 578 The train loss 0.916668948395
 509 / 578 The train loss 0.915110206169
 510 / 578 The train loss 0.913503255732
 511 / 578 The train loss 0.911928485481
 512 / 578 The train loss 0.910346303826
 513 / 578 The train loss 0.908743317444
 514 / 578 The train loss 0.907194568149
 515 / 578 The train loss 0.905732043804
 516 / 578 The train loss 0.904343139243
 517 / 578 The train loss 0.903025216252
 518 / 578 The train loss 0.901651127926
 519 / 578 The train loss 0.900228165255
 520 / 578 The train loss 0.898828255965
 521 / 578 The train loss 0.897469913555
 522 / 578 The train loss 0.895987816362
 523 / 578 The train loss 0.894623073812
 524 / 578 The train loss 0.893512066887
 525 / 578 The train loss 0.892351353857
 526 / 578 The train loss 0.8912464186
 527 / 578 The train loss 0.890271830046
 528 / 578 The train loss 0.889245353308
 529 / 578 The train loss 0.888250182511
 530 / 578 The train loss 0.88730746334
 531 / 578 The train loss 0.886309666324
 532 / 578 The train loss 0.885212654073
 533 / 578 The train loss 0.884115399615
 534 / 578 The train loss 0.882910789127
 535 / 578 The train loss 0.881545245459
 536 / 578 The train loss 0.880239088095
 537 / 578 The train loss 0.878993700787
 538 / 578 The train loss 0.877946935573
 539 / 578 The train loss 0.876822710486
 540 / 578 The train loss 0.875625971463
 541 / 578 The train loss 0.874557314162
 542 / 578 The train loss 0.87354319755
 543 / 578 The train loss 0.872392494273
 544 / 578 The train loss 0.871415895581
 545 / 578 The train loss 0.870853705807
 546 / 578 The train loss 0.871336918181
 547 / 578 The train loss 0.872200541573
 548 / 578 The train loss 0.874533966299
 549 / 578 The train loss 0.875729706125
 550 / 578 The train loss 0.876247303709
 551 / 578 The train loss 0.876587738582
 552 / 578 The train loss 0.87659375705
 553 / 578 The train loss 0.876809659321
 554 / 578 The train loss 0.876976281196
 555 / 578 The train loss 0.876792581024
 556 / 578 The train loss 0.876568013501
 557 / 578 The train loss 0.876495969794
 558 / 578 The train loss 0.876282651478
 559 / 578 The train loss 0.876299742942
 560 / 578 The train loss 0.876280471157
 561 / 578 The train loss 0.876363332159
 562 / 578 The train loss 0.876373158344
 563 / 578 The train loss 0.876207868018
 564 / 578 The train loss 0.875168889186
 565 / 578 The train loss 0.874067795705
 566 / 578 The train loss 0.873055170109
 567 / 578 The train loss 0.871960157394
 568 / 578 The train loss 0.870877961989
 569 / 578 The train loss 0.869802049631
 570 / 578 The train loss 0.868651676498
 571 / 578 The train loss 0.867598094013
 572 / 578 The train loss 0.866742386899
 573 / 578 The train loss 0.866877861366
 574 / 578 The train loss 0.866814655093
 575 / 578 The train loss 0.866220947355
 576 / 578 The train loss 0.865813545682
 577 / 578 The train loss 0.865420120641
 578 / 578 The train loss 0.865068930971

Starting epoch 7
Validation:
 1 / 30 The valid loss 0.460619390011
 2 / 30 The valid loss 0.434220254421
 ...
 29 / 30 The valid loss 0.820557589161
 30 / 30 The valid loss 0.802116503318

Training
 1 / 578 The train loss 0.545511305332
 2 / 578 The train loss 0.429003953934
 3 / 578 The train loss 0.370063503583
 ...
 497 / 578 The train loss 0.762412378703
 498 / 578 The train loss 0.761745324313
 499 / 578 The train loss 0.760974380178
 500 / 578 The train loss 0.759968830325
 501 / 578 The train loss 0.758834199731
 502 / 578 The train loss 0.757792389584
 503 / 578 The train loss 0.756897826533
 504 / 578 The train loss 0.755940204967
 505 / 578 The train loss 0.755018819005
 506 / 578 The train loss 0.754048057438
 507 / 578 The train loss 0.753064509305
 508 / 578 The train loss 0.751926471787
 509 / 578 The train loss 0.750675733313
 510 / 578 The train loss 0.749419753889
 511 / 578 The train loss 0.748191867432
 512 / 578 The train loss 0.746937514552
 513 / 578 The train loss 0.745654448395
 514 / 578 The train loss 0.744408295421
 515 / 578 The train loss 0.743210505969
 516 / 578 The train loss 0.742090932094
 517 / 578 The train loss 0.741044250106
 518 / 578 The train loss 0.739970543433
 519 / 578 The train loss 0.738813882525
 520 / 578 The train loss 0.737780488992
 521 / 578 The train loss 0.736758422621
 522 / 578 The train loss 0.735576811586
 523 / 578 The train loss 0.734505760076
 524 / 578 The train loss 0.733598431579
 525 / 578 The train loss 0.732775654984
 526 / 578 The train loss 0.731864765534
 527 / 578 The train loss 0.731063385298
 528 / 578 The train loss 0.730340015277
 529 / 578 The train loss 0.7295866597
 530 / 578 The train loss 0.728886245072
 531 / 578 The train loss 0.728077301219
 532 / 578 The train loss 0.727296065252
 533 / 578 The train loss 0.726500815364
 534 / 578 The train loss 0.72557964764
 535 / 578 The train loss 0.724497625559
 536 / 578 The train loss 0.723416578296
 537 / 578 The train loss 0.722468492421
 538 / 578 The train loss 0.721582466058
 539 / 578 The train loss 0.72069089548
 540 / 578 The train loss 0.719930913927
 541 / 578 The train loss 0.719059604284
 542 / 578 The train loss 0.718275633376
 543 / 578 The train loss 0.717353799883
 544 / 578 The train loss 0.716763706268
 545 / 578 The train loss 0.71634173223
 546 / 578 The train loss 0.717109019114
 547 / 578 The train loss 0.718558050426
 548 / 578 The train loss 0.721306688642
 549 / 578 The train loss 0.722884340412
 550 / 578 The train loss 0.723908192942
 551 / 578 The train loss 0.724700074061
 552 / 578 The train loss 0.725243212351
 553 / 578 The train loss 0.725872486069
 554 / 578 The train loss 0.72654532315
 555 / 578 The train loss 0.726663572771
 556 / 578 The train loss 0.726654000895
 557 / 578 The train loss 0.726729999372
 558 / 578 The train loss 0.726943804739
 559 / 578 The train loss 0.72741741757
 560 / 578 The train loss 0.727844662798
 561 / 578 The train loss 0.728137841777
 562 / 578 The train loss 0.728534203454
 563 / 578 The train loss 0.728643796524
 564 / 578 The train loss 0.727837895717
 565 / 578 The train loss 0.727063050675
 566 / 578 The train loss 0.726234486313
 567 / 578 The train loss 0.725342859213
 568 / 578 The train loss 0.724493062327
 569 / 578 The train loss 0.723619380157
 570 / 578 The train loss 0.722805776218
 571 / 578 The train loss 0.721926959476
 572 / 578 The train loss 0.721225669145
 573 / 578 The train loss 0.721403756814
 574 / 578 The train loss 0.721547748692
 575 / 578 The train loss 0.721232259708
 576 / 578 The train loss 0.721158621231
 577 / 578 The train loss 0.720938510063
 578 / 578 The train loss 0.720923281478

Starting epoch 8
Validation:
 1 / 30 The valid loss 0.463320881128
 ... (per-batch running losses for steps 2–29 elided) ...
 30 / 30 The valid loss 0.811760478218

Training
 1 / 578 The train loss 0.586513876915
 ... (per-batch running losses for steps 2–577 elided) ...
 578 / 578 The train loss 0.651876786389

Starting epoch 9
Validation:
 1 / 30 The valid loss 0.449623316526
 ... (per-batch running losses for steps 2–29 elided) ...
 30 / 30 The valid loss 0.730864402652

Validation MSE(val_loss): 15.5000451153

Test MSE(test_loss): 4.81319693507
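Each log line above reports the running mean of the loss over all batches seen so far in the current epoch, which is why the printed value drifts smoothly rather than jumping batch to batch. A minimal sketch of how such a log line could be produced (this is an illustration of the format, not the project's actual training code; `running_mean_log` and its arguments are hypothetical names):

```python
def running_mean_log(losses, total, label="train"):
    """Emit one log line per batch with the running mean loss,
    mimicking the ' i / total The <label> loss <mean>' format above."""
    cumulative = 0.0
    lines = []
    for i, loss in enumerate(losses, start=1):
        cumulative += loss
        lines.append(" %d / %d The %s loss %s" % (i, total, label, cumulative / i))
    return lines

# Example with three dummy batch losses
for line in running_mean_log([0.6, 0.4, 0.2], 3):
    print(line)
```

Because each value is a cumulative average, a single noisy batch late in the epoch moves the printed number only slightly, which matches the gradual curves visible in the elided log.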
Training
 1 / 578 The train loss 0.510433375835
 2 / 578 The train loss 0.42087328434
 ... [batches 3-576 omitted: train loss bottoms out at 0.1824 (batch 57), climbs to 0.7142 (batch 362), then drifts back down] ...
 577 / 578 The train loss 0.584442895522
 578 / 578 The train loss 0.584204631452

Starting epoch 10
Validation:
 1 / 30 The valid loss 0.431452602148
 2 / 30 The valid loss 0.407194644213
 ... [batches 3-28 omitted: valid loss bottoms out at 0.3241 (batch 5), then peaks at 0.8668 (batch 19)] ...
 29 / 30 The valid loss 0.749582783415
 30 / 30 The valid loss 0.732721613844

Training
 1 / 578 The train loss 0.502926468849
 2 / 578 The train loss 0.397271022201
 3 / 578 The train loss 0.356914093097
 4 / 578 The train loss 0.32880949229
 5 / 578 The train loss 0.316450423002
 6 / 578 The train loss 0.286282718182
 7 / 578 The train loss 0.268071268286
 8 / 578 The train loss 0.244527613744
 9 / 578 The train loss 0.224851886431
 10 / 578 The train loss 0.208919147402
 11 / 578 The train loss 0.20004045015
 12 / 578 The train loss 0.195041978111
 13 / 578 The train loss 0.201460811954
 14 / 578 The train loss 0.214101598731
 15 / 578 The train loss 0.222971190015
 16 / 578 The train loss 0.226673127152
 17 / 578 The train loss 0.22700625921
 18 / 578 The train loss 0.224129632943
 19 / 578 The train loss 0.218493903154
 20 / 578 The train loss 0.21108893007
 21 / 578 The train loss 0.204109152513
 22 / 578 The train loss 0.197117830034
 23 / 578 The train loss 0.190525883728
 24 / 578 The train loss 0.184018763093
 25 / 578 The train loss 0.178632653356
 26 / 578 The train loss 0.175000063215
 27 / 578 The train loss 0.172434220711
 28 / 578 The train loss 0.169450778514
 29 / 578 The train loss 0.165361426
 30 / 578 The train loss 0.162511385729
 31 / 578 The train loss 0.160154001847
 32 / 578 The train loss 0.157175661065
 33 / 578 The train loss 0.155596541636
 34 / 578 The train loss 0.156184404212
 35 / 578 The train loss 0.157600733212
 36 / 578 The train loss 0.159125955568
 37 / 578 The train loss 0.161483832308
 38 / 578 The train loss 0.163734167422
 39 / 578 The train loss 0.163037609596
 40 / 578 The train loss 0.162241910771
 41 / 578 The train loss 0.160554532961
 42 / 578 The train loss 0.158848800475
 43 / 578 The train loss 0.158177263168
 44 / 578 The train loss 0.158443953165
 45 / 578 The train loss 0.15878684885
 46 / 578 The train loss 0.157183402582
 47 / 578 The train loss 0.155228628916
 48 / 578 The train loss 0.153525311655
 49 / 578 The train loss 0.152109107041
 50 / 578 The train loss 0.151132398993
 51 / 578 The train loss 0.150941237661
 52 / 578 The train loss 0.151833658751
 53 / 578 The train loss 0.151448771498
 54 / 578 The train loss 0.151002481166
 55 / 578 The train loss 0.150609674643
 56 / 578 The train loss 0.148959505771
 57 / 578 The train loss 0.147843381161
 58 / 578 The train loss 0.148065080802
 59 / 578 The train loss 0.148408591621
 60 / 578 The train loss 0.151466856276
 61 / 578 The train loss 0.155233644438
 62 / 578 The train loss 0.159631443
 63 / 578 The train loss 0.162204622158
 64 / 578 The train loss 0.163164459518
 65 / 578 The train loss 0.163977343417
 66 / 578 The train loss 0.162679236947
 67 / 578 The train loss 0.161190419277
 68 / 578 The train loss 0.159741265699
 69 / 578 The train loss 0.158190244944
 70 / 578 The train loss 0.157196299253
 71 / 578 The train loss 0.156962677574
 72 / 578 The train loss 0.158170510653
 73 / 578 The train loss 0.159333493356
 74 / 578 The train loss 0.160954963265
 75 / 578 The train loss 0.164944134305
 76 / 578 The train loss 0.16661851162
 77 / 578 The train loss 0.165760192633
 78 / 578 The train loss 0.165724499724
 79 / 578 The train loss 0.167377976536
 80 / 578 The train loss 0.168315695086
 81 / 578 The train loss 0.169460001413
 82 / 578 The train loss 0.173480311772
 83 / 578 The train loss 0.177756878491
 84 / 578 The train loss 0.181961098109
 85 / 578 The train loss 0.185832248234
 86 / 578 The train loss 0.189459617964
 87 / 578 The train loss 0.194198585319
 88 / 578 The train loss 0.197318121664
 89 / 578 The train loss 0.204113922636
 90 / 578 The train loss 0.208280340706
 91 / 578 The train loss 0.207678990909
 92 / 578 The train loss 0.207375944511
 93 / 578 The train loss 0.206213312484
 94 / 578 The train loss 0.205162111868
 95 / 578 The train loss 0.204403766009
 96 / 578 The train loss 0.203814366483
 97 / 578 The train loss 0.203163226817
 98 / 578 The train loss 0.203089854965
 99 / 578 The train loss 0.203961808385
 100 / 578 The train loss 0.204883572571
 101 / 578 The train loss 0.2052015262
 102 / 578 The train loss 0.20519939894
 103 / 578 The train loss 0.204691093487
 104 / 578 The train loss 0.20430605941
 105 / 578 The train loss 0.20444256007
 106 / 578 The train loss 0.204512992101
 107 / 578 The train loss 0.203550301875
 108 / 578 The train loss 0.203781732289
 109 / 578 The train loss 0.203327615634
 110 / 578 The train loss 0.202538664131
 111 / 578 The train loss 0.202267772745
 112 / 578 The train loss 0.202859206158
 113 / 578 The train loss 0.203509025375
 114 / 578 The train loss 0.205433010042
 115 / 578 The train loss 0.208105718604
 116 / 578 The train loss 0.213228952814
 117 / 578 The train loss 0.217428166165
 118 / 578 The train loss 0.22162219279
 119 / 578 The train loss 0.225700858292
 120 / 578 The train loss 0.231208381585
 121 / 578 The train loss 0.236928568922
 122 / 578 The train loss 0.24262049553
 123 / 578 The train loss 0.250280697202
 124 / 578 The train loss 0.259439889492
 125 / 578 The train loss 0.269612472564
 126 / 578 The train loss 0.279915576919
 127 / 578 The train loss 0.290122173932
 128 / 578 The train loss 0.29932997658
 129 / 578 The train loss 0.311146312935
 130 / 578 The train loss 0.323411778332
 131 / 578 The train loss 0.332520120955
 132 / 578 The train loss 0.335988014513
 133 / 578 The train loss 0.337665901176
 134 / 578 The train loss 0.338379820586
 135 / 578 The train loss 0.340824967209
 136 / 578 The train loss 0.344331899795
 137 / 578 The train loss 0.348528708664
 138 / 578 The train loss 0.350369844559
 139 / 578 The train loss 0.35139097544
 140 / 578 The train loss 0.351227713084
 141 / 578 The train loss 0.350883672661
 142 / 578 The train loss 0.351087119614
 143 / 578 The train loss 0.352234949
 144 / 578 The train loss 0.353750813571
 145 / 578 The train loss 0.354764638918
 146 / 578 The train loss 0.354640105229
 147 / 578 The train loss 0.354883438698
 148 / 578 The train loss 0.354937578199
 149 / 578 The train loss 0.353706184075
 150 / 578 The train loss 0.352771534945
 151 / 578 The train loss 0.350688012625
 152 / 578 The train loss 0.348852077205
 153 / 578 The train loss 0.346933819037
 154 / 578 The train loss 0.345171311532
 155 / 578 The train loss 0.343986846819
 156 / 578 The train loss 0.342919553964
 157 / 578 The train loss 0.342225909684
 ...
 574 / 578 The train loss 0.50454876189
 575 / 578 The train loss 0.50425451096
 576 / 578 The train loss 0.504095386132
 577 / 578 The train loss 0.503836901946
 578 / 578 The train loss 0.50361374339

Starting epoch 11
Validation:
 1 / 30 The valid loss 0.430199474096
 2 / 30 The valid loss 0.405474439263
 3 / 30 The valid loss 0.378708531459
 4 / 30 The valid loss 0.335772275925
 5 / 30 The valid loss 0.304791176319
 6 / 30 The valid loss 0.333630969127
 7 / 30 The valid loss 0.434569214072
 8 / 30 The valid loss 0.569503821433
 9 / 30 The valid loss 0.709261609448
 10 / 30 The valid loss 0.785347598791
 11 / 30 The valid loss 0.803475515409
 12 / 30 The valid loss 0.81267079711
 13 / 30 The valid loss 0.819989062273
 14 / 30 The valid loss 0.823305972985
 15 / 30 The valid loss 0.80845990181
 16 / 30 The valid loss 0.802762564272
 17 / 30 The valid loss 0.816301889279
 18 / 30 The valid loss 0.826924731334
 19 / 30 The valid loss 0.827641345953
 20 / 30 The valid loss 0.822344475985
 21 / 30 The valid loss 0.807548173836
 22 / 30 The valid loss 0.793464452028
 23 / 30 The valid loss 0.781309402507
 24 / 30 The valid loss 0.772236317396
 25 / 30 The valid loss 0.763131222725
 26 / 30 The valid loss 0.757988326825
 27 / 30 The valid loss 0.746709728682
 28 / 30 The valid loss 0.72892548676
 29 / 30 The valid loss 0.710386893359
 30 / 30 The valid loss 0.695513339341

Validation MSE(val_loss): 14.7503256502

Test MSE(test_loss): 4.32328072875
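The per-batch lines above appear to be a running mean of the batch losses (the printed value changes smoothly rather than jumping batch to batch). A minimal sketch of a logger that produces output in this shape is below; it is an illustration, not the project's actual training code — `run_epoch` and `batch_losses` are hypothetical names, and the loss values stand in for what `model.train_on_batch` would return.

```python
def run_epoch(batch_losses, tag="train"):
    """Print-style logger: running mean of per-batch losses.

    batch_losses: per-batch loss values (stand-ins for what a call
    like model.train_on_batch would return each step).
    Returns (final running mean, list of formatted log lines).
    """
    total = 0.0
    n = len(batch_losses)
    lines = []
    for i, loss in enumerate(batch_losses, start=1):
        total += loss            # accumulate raw batch loss
        running = total / i      # running mean over batches seen so far
        lines.append(" %d / %d The %s loss %s" % (i, n, tag, running))
    return running, lines

final, log = run_epoch([0.4, 0.2, 0.3])
print("\n".join(log))
```

With three batches of loss 0.4, 0.2, 0.3 the running mean settles at 0.3, matching the smooth drift seen in the log above.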
Training
 1 / 578 The train loss 0.512278556824
 2 / 578 The train loss 0.349244430661
 3 / 578 The train loss 0.288971463839
 4 / 578 The train loss 0.295878715813
 5 / 578 The train loss 0.298770433664
 ...
 422 / 578 The train loss 0.484538858461
 423 / 578 The train loss 0.485482965162
 424 / 578 The train loss 0.486377992505
 425 / 578 The train loss 0.486938587423
 426 / 578 The train loss 0.487451693887
 427 / 578 The train loss 0.487420694955
 428 / 578 The train loss 0.486886403058
 429 / 578 The train loss 0.486281525661
 430 / 578 The train loss 0.485560083835
 431 / 578 The train loss 0.484773035803
 432 / 578 The train loss 0.4837395258
 433 / 578 The train loss 0.482811965027
 434 / 578 The train loss 0.482202487054
 435 / 578 The train loss 0.481357638568
 436 / 578 The train loss 0.480420718512
 437 / 578 The train loss 0.479649695958
 438 / 578 The train loss 0.479270550057
 439 / 578 The train loss 0.479059976888
 440 / 578 The train loss 0.478813219862
 441 / 578 The train loss 0.478517501106
 442 / 578 The train loss 0.478327511371
 443 / 578 The train loss 0.478078009964
 444 / 578 The train loss 0.478462219754
 445 / 578 The train loss 0.47942507131
 446 / 578 The train loss 0.480606950164
 447 / 578 The train loss 0.481401609474
 448 / 578 The train loss 0.481703886475
 449 / 578 The train loss 0.481775046967
 450 / 578 The train loss 0.481981786999
 451 / 578 The train loss 0.482119788992
 452 / 578 The train loss 0.482148291176
 453 / 578 The train loss 0.482238848128
 454 / 578 The train loss 0.482076966401
 455 / 578 The train loss 0.481860154257
 456 / 578 The train loss 0.481360537225
 457 / 578 The train loss 0.481113479532
 458 / 578 The train loss 0.480887844813
 459 / 578 The train loss 0.480610605009
 460 / 578 The train loss 0.480281286569
 461 / 578 The train loss 0.48002208499
 462 / 578 The train loss 0.479796179812
 463 / 578 The train loss 0.480348776
 464 / 578 The train loss 0.481160161932
 465 / 578 The train loss 0.481714190314
 466 / 578 The train loss 0.482210788584
 467 / 578 The train loss 0.482448632743
 468 / 578 The train loss 0.482477821963
 469 / 578 The train loss 0.482601053913
 470 / 578 The train loss 0.482653266425
 471 / 578 The train loss 0.48259735909
 472 / 578 The train loss 0.482440886273
 473 / 578 The train loss 0.482091634501
 474 / 578 The train loss 0.481687628185
 475 / 578 The train loss 0.481094229421
 476 / 578 The train loss 0.480317678439
 477 / 578 The train loss 0.479524474517
 478 / 578 The train loss 0.478719906948
 479 / 578 The train loss 0.47806872032
 480 / 578 The train loss 0.477591209731
 481 / 578 The train loss 0.476995545151
 482 / 578 The train loss 0.476511304553
 483 / 578 The train loss 0.476000131808
 484 / 578 The train loss 0.475291309459
 485 / 578 The train loss 0.474781488385
 486 / 578 The train loss 0.474217847783
 487 / 578 The train loss 0.473616536389
 488 / 578 The train loss 0.472924619573
 489 / 578 The train loss 0.472332576145
 490 / 578 The train loss 0.471751088995
 491 / 578 The train loss 0.471295823994
 492 / 578 The train loss 0.470887108604
 493 / 578 The train loss 0.470432923056
 494 / 578 The train loss 0.469835456291
 495 / 578 The train loss 0.469265181695
 496 / 578 The train loss 0.468853334171
 497 / 578 The train loss 0.46853232992
 498 / 578 The train loss 0.468209787332
 499 / 578 The train loss 0.467982205243
 500 / 578 The train loss 0.467476594593
 501 / 578 The train loss 0.466834334434
 502 / 578 The train loss 0.466186988694
 503 / 578 The train loss 0.46574352782
 504 / 578 The train loss 0.465374311902
 505 / 578 The train loss 0.464933841473
 506 / 578 The train loss 0.464339210932
 507 / 578 The train loss 0.463848120535
 508 / 578 The train loss 0.46324473009
 509 / 578 The train loss 0.462571961293
 510 / 578 The train loss 0.461846428916
 511 / 578 The train loss 0.461233982525
 512 / 578 The train loss 0.460627726872
 513 / 578 The train loss 0.459885187051
 514 / 578 The train loss 0.459261093531
 515 / 578 The train loss 0.458697044122
 516 / 578 The train loss 0.458182506837
 517 / 578 The train loss 0.45774791509
 518 / 578 The train loss 0.457147339363
 519 / 578 The train loss 0.456483644209
 520 / 578 The train loss 0.455881304118
 521 / 578 The train loss 0.455259869283
 522 / 578 The train loss 0.454473940809
 523 / 578 The train loss 0.453735818445
 524 / 578 The train loss 0.45327908138
 525 / 578 The train loss 0.452938073834
 526 / 578 The train loss 0.452388752085
 527 / 578 The train loss 0.45198387407
 528 / 578 The train loss 0.451468954896
 529 / 578 The train loss 0.451217683666
 530 / 578 The train loss 0.450824521837
 531 / 578 The train loss 0.45030312023
 532 / 578 The train loss 0.449939126909
 533 / 578 The train loss 0.449505920583
 534 / 578 The train loss 0.448980396931
 535 / 578 The train loss 0.448433746874
 536 / 578 The train loss 0.447830905544
 537 / 578 The train loss 0.447267708488
 538 / 578 The train loss 0.446757327104
 539 / 578 The train loss 0.446330623763
 540 / 578 The train loss 0.445896283913
 541 / 578 The train loss 0.445327572225
 542 / 578 The train loss 0.444819997972
 543 / 578 The train loss 0.444272355964
 544 / 578 The train loss 0.443866996166
 545 / 578 The train loss 0.443873897786
 546 / 578 The train loss 0.445102473063
 547 / 578 The train loss 0.44659840207
 548 / 578 The train loss 0.449726028997
 549 / 578 The train loss 0.451652904206
 550 / 578 The train loss 0.452931035611
 551 / 578 The train loss 0.454276463909
 552 / 578 The train loss 0.455273691474
 553 / 578 The train loss 0.456240910804
 554 / 578 The train loss 0.457273945337
 555 / 578 The train loss 0.458048053223
 556 / 578 The train loss 0.458607034359
 557 / 578 The train loss 0.459561824655
 558 / 578 The train loss 0.4605354003
 559 / 578 The train loss 0.461394931598
 560 / 578 The train loss 0.462338890737
 561 / 578 The train loss 0.462883279355
 562 / 578 The train loss 0.463303132059
 563 / 578 The train loss 0.463310629411
 564 / 578 The train loss 0.462827418756
 565 / 578 The train loss 0.462178026024
 566 / 578 The train loss 0.461602879501
 567 / 578 The train loss 0.461018924284
 568 / 578 The train loss 0.46039420949
 569 / 578 The train loss 0.459842831292
 570 / 578 The train loss 0.459317657224
 571 / 578 The train loss 0.458700888854
 572 / 578 The train loss 0.458221520646
 573 / 578 The train loss 0.458251775472
 574 / 578 The train loss 0.458140870748
 575 / 578 The train loss 0.457773836064
 576 / 578 The train loss 0.457569834197
 577 / 578 The train loss 0.457373135208
 578 / 578 The train loss 0.457058253112

Starting epoch 12
Validation:
 1 / 30 The valid loss 0.422050267458
 ...
 30 / 30 The valid loss 0.677911366026

Validation MSE(val_loss): 14.377026435

Test MSE(test_loss): 4.01394664516
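上面日志中每个批次打印的损失值变化平滑,推测是对逐批 MSE 取累计平均后输出的(这是一个假设,并非笔记本原始代码;函数名 `log_running_loss` 为示意用的假想名称)。一个最小的实现草图如下:

```python
# 运行平均损失日志的最小草图(假设:打印值为逐批损失的累计平均)。
def log_running_loss(batch_losses, total_batches, label="train"):
    """逐批打印 ' <i> / <total> The <label> loss <running mean>'。"""
    running_sum = 0.0
    lines = []
    for i, loss in enumerate(batch_losses, start=1):
        running_sum += loss
        # 累计平均:running_sum / i,与日志中平滑变化的数值形态一致
        line = " %d / %d The %s loss %s" % (i, total_batches, label, running_sum / i)
        lines.append(line)
        print(line)
    return lines
```

在训练循环中,每处理完一个批次即调用一次(或在循环内累加后统一打印),即可得到与上面相同格式的日志。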
Training
 1 / 578 The train loss 0.397657185793
 ...
 578 / 578 The train loss 0.403742483421

Starting epoch 13
Validation:
 1 / 30 The valid loss 0.410781592131
 [... per-batch loss output trimmed: running valid loss dipped to ~0.260, peaked at ~0.777 ...]
 30 / 30 The valid loss 0.640425000091

Validation MSE(val_loss): 13.5820222113

Test MSE(test_loss): 3.57910447677
Training
 1 / 578 The train loss 0.368470460176
 [... per-batch loss output trimmed: running train loss fell to ~0.112 by batch 58, rose to a peak of ~0.441 near batch 271, then eased back ...]
 578 / 578 The train loss 0.340436501748

Starting epoch 14
Validation:
 1 / 30 The valid loss 0.396561205387
 [... per-batch loss output trimmed: running valid loss dipped to ~0.247, peaked at ~0.785 ...]
 30 / 30 The valid loss 0.641604083528

Training
 1 / 578 The train loss 0.321450829506
 [... per-batch loss output trimmed: running train loss fell to ~0.086 by batch 57, then climbed steadily ...]
 174 / 578 The train loss 0.275192715784
 175 / 578 The train loss 0.278382012972
 176 / 578 The train loss 0.28080516118
 177 / 578 The train loss 0.282745334178
 178 / 578 The train loss 0.284818895203
 179 / 578 The train loss 0.286486622799
 180 / 578 The train loss 0.28758379134
 181 / 578 The train loss 0.288226101568
 182 / 578 The train loss 0.28811479102
 183 / 578 The train loss 0.287517724101
 184 / 578 The train loss 0.288750320632
 185 / 578 The train loss 0.292362422516
 186 / 578 The train loss 0.293848650069
 187 / 578 The train loss 0.29575384711
 188 / 578 The train loss 0.301426442023
 189 / 578 The train loss 0.310258181914
 190 / 578 The train loss 0.319636823276
 191 / 578 The train loss 0.32537049176
 192 / 578 The train loss 0.32996557618
 193 / 578 The train loss 0.334188678374
 194 / 578 The train loss 0.336650702718
 195 / 578 The train loss 0.336476248006
 196 / 578 The train loss 0.336798253526
 197 / 578 The train loss 0.338299777113
 198 / 578 The train loss 0.340814063457
 199 / 578 The train loss 0.344506643937
 200 / 578 The train loss 0.348536721878
 201 / 578 The train loss 0.351303435195
 202 / 578 The train loss 0.355159648685
 203 / 578 The train loss 0.358573462686
 204 / 578 The train loss 0.360790954611
 205 / 578 The train loss 0.360366994106
 206 / 578 The train loss 0.360782180994
 207 / 578 The train loss 0.361425890115
 208 / 578 The train loss 0.361615603037
 209 / 578 The train loss 0.361725791041
 210 / 578 The train loss 0.361592251453
 211 / 578 The train loss 0.361293169524
 212 / 578 The train loss 0.360382355574
 213 / 578 The train loss 0.359665291515
 214 / 578 The train loss 0.359235013861
 215 / 578 The train loss 0.359202260299
 216 / 578 The train loss 0.358335517243
 217 / 578 The train loss 0.357173673809
 218 / 578 The train loss 0.356095269015
 219 / 578 The train loss 0.355164876097
 220 / 578 The train loss 0.35400664444
 221 / 578 The train loss 0.353044755217
 222 / 578 The train loss 0.35259040388
 223 / 578 The train loss 0.352698526876
 224 / 578 The train loss 0.353049233589
 225 / 578 The train loss 0.353297888603
 226 / 578 The train loss 0.353703048014
 227 / 578 The train loss 0.354285850169
 228 / 578 The train loss 0.354758356387
 229 / 578 The train loss 0.354794097682
 230 / 578 The train loss 0.354848108376
 231 / 578 The train loss 0.354448190015
 232 / 578 The train loss 0.353945132634
 233 / 578 The train loss 0.353982757249
 234 / 578 The train loss 0.353926476027
 235 / 578 The train loss 0.353659375075
 236 / 578 The train loss 0.352729719037
 237 / 578 The train loss 0.352855789731
 238 / 578 The train loss 0.356013416962
 239 / 578 The train loss 0.358249817651
 240 / 578 The train loss 0.358008517915
 241 / 578 The train loss 0.356917076462
 242 / 578 The train loss 0.356325189252
 243 / 578 The train loss 0.356452398401
 244 / 578 The train loss 0.356801124076
 245 / 578 The train loss 0.357152630906
 246 / 578 The train loss 0.357500108822
 247 / 578 The train loss 0.356491196614
 248 / 578 The train loss 0.355659003219
 249 / 578 The train loss 0.354953375626
 250 / 578 The train loss 0.354957784832
 251 / 578 The train loss 0.355830063561
 252 / 578 The train loss 0.357138932638
 253 / 578 The train loss 0.359296840582
 254 / 578 The train loss 0.361932123212
 255 / 578 The train loss 0.364099805904
 256 / 578 The train loss 0.366976825928
 257 / 578 The train loss 0.370705811836
 258 / 578 The train loss 0.373356046545
 259 / 578 The train loss 0.374358312906
 260 / 578 The train loss 0.374970698529
 261 / 578 The train loss 0.374783312966
 262 / 578 The train loss 0.374671872051
 263 / 578 The train loss 0.374491352942
 264 / 578 The train loss 0.374329203966
 265 / 578 The train loss 0.374397912284
 266 / 578 The train loss 0.374503568543
 267 / 578 The train loss 0.374469315012
 268 / 578 The train loss 0.374082648876
 269 / 578 The train loss 0.374040026481
 270 / 578 The train loss 0.375396246215
 271 / 578 The train loss 0.37561934752
 272 / 578 The train loss 0.37502494416
 273 / 578 The train loss 0.374688155867
 274 / 578 The train loss 0.374030596855
 275 / 578 The train loss 0.372954615002
 276 / 578 The train loss 0.372014904179
 277 / 578 The train loss 0.371167832468
 278 / 578 The train loss 0.370682550661
 279 / 578 The train loss 0.369721837769
 280 / 578 The train loss 0.36975961293
 281 / 578 The train loss 0.369310869202
 282 / 578 The train loss 0.368543025635
 283 / 578 The train loss 0.368549458735
 284 / 578 The train loss 0.367881184129
 285 / 578 The train loss 0.367405992982
 286 / 578 The train loss 0.366629488688
 287 / 578 The train loss 0.366045691488
 288 / 578 The train loss 0.365636319258
 289 / 578 The train loss 0.365442955808
 290 / 578 The train loss 0.365238448387
 291 / 578 The train loss 0.365218743451
 292 / 578 The train loss 0.36511081327
 293 / 578 The train loss 0.364666517862
 294 / 578 The train loss 0.364063519072
 295 / 578 The train loss 0.363622504094
 296 / 578 The train loss 0.362889908028
 297 / 578 The train loss 0.362135227356
 298 / 578 The train loss 0.361354448976
 299 / 578 The train loss 0.360921416668
 300 / 578 The train loss 0.360951127484
 301 / 578 The train loss 0.360996573842
 302 / 578 The train loss 0.361173682828
 303 / 578 The train loss 0.361375628599
 304 / 578 The train loss 0.361218807018
 305 / 578 The train loss 0.361403356004
 306 / 578 The train loss 0.361678744916
 307 / 578 The train loss 0.361523270486
 308 / 578 The train loss 0.361757865659
 309 / 578 The train loss 0.361702490176
 310 / 578 The train loss 0.361314891303
 311 / 578 The train loss 0.360769405458
 312 / 578 The train loss 0.360567914967
 313 / 578 The train loss 0.360187700262
 314 / 578 The train loss 0.359508198324
 315 / 578 The train loss 0.358640552963
 316 / 578 The train loss 0.358869766392
 317 / 578 The train loss 0.359459390094
 318 / 578 The train loss 0.359592397516
 319 / 578 The train loss 0.359653747264
 320 / 578 The train loss 0.359537992231
 321 / 578 The train loss 0.359143141297
 322 / 578 The train loss 0.358747952404
 323 / 578 The train loss 0.358504413659
 324 / 578 The train loss 0.357640649847
 325 / 578 The train loss 0.3570745674
 326 / 578 The train loss 0.356623874714
 327 / 578 The train loss 0.355718403143
 328 / 578 The train loss 0.354704779266
 329 / 578 The train loss 0.353752760098
 330 / 578 The train loss 0.352738908469
 331 / 578 The train loss 0.351828014298
 332 / 578 The train loss 0.350926517167
 333 / 578 The train loss 0.350004151167
 334 / 578 The train loss 0.349208972495
 335 / 578 The train loss 0.348402538712
 336 / 578 The train loss 0.348072657533
 337 / 578 The train loss 0.347647305274
 338 / 578 The train loss 0.347171950705
 339 / 578 The train loss 0.346740883091
 340 / 578 The train loss 0.345961193297
 341 / 578 The train loss 0.345041952902
 342 / 578 The train loss 0.344152368022
 343 / 578 The train loss 0.343399852085
 344 / 578 The train loss 0.342613036468
 345 / 578 The train loss 0.3419352533
 346 / 578 The train loss 0.341234432442
 347 / 578 The train loss 0.340877476409
 348 / 578 The train loss 0.340631272478
 349 / 578 The train loss 0.340402310981
 350 / 578 The train loss 0.339844416981
 351 / 578 The train loss 0.339337442681
 352 / 578 The train loss 0.338897518988
 353 / 578 The train loss 0.338381739587
 354 / 578 The train loss 0.338159327206
 355 / 578 The train loss 0.338577085823
 356 / 578 The train loss 0.338540390558
 357 / 578 The train loss 0.338572186337
 358 / 578 The train loss 0.338384266261
 359 / 578 The train loss 0.338070096043
 360 / 578 The train loss 0.338299802572
 361 / 578 The train loss 0.338572868314
 362 / 578 The train loss 0.338382127664
 363 / 578 The train loss 0.338145778163
 364 / 578 The train loss 0.337865227777
 365 / 578 The train loss 0.337388765245
 366 / 578 The train loss 0.3366120041
 367 / 578 The train loss 0.335870098236
 368 / 578 The train loss 0.335102398693
 369 / 578 The train loss 0.334286404309
 370 / 578 The train loss 0.333512793436
 371 / 578 The train loss 0.332710461932
 372 / 578 The train loss 0.331938770615
 373 / 578 The train loss 0.331226286581
 374 / 578 The train loss 0.330560454616
 375 / 578 The train loss 0.33005034326
 376 / 578 The train loss 0.329391854067
 377 / 578 The train loss 0.328591802274
 378 / 578 The train loss 0.327805618278
 379 / 578 The train loss 0.327085008221
 380 / 578 The train loss 0.326388422802
 381 / 578 The train loss 0.325954606771
 382 / 578 The train loss 0.32587378867
 383 / 578 The train loss 0.325589660613
 384 / 578 The train loss 0.325163139777
 385 / 578 The train loss 0.324793111823
 386 / 578 The train loss 0.324253982879
 387 / 578 The train loss 0.323625924581
 388 / 578 The train loss 0.322930801929
 389 / 578 The train loss 0.322327945295
 390 / 578 The train loss 0.321565677278
 391 / 578 The train loss 0.320806034753
 392 / 578 The train loss 0.320095572796
 393 / 578 The train loss 0.319516879822
 394 / 578 The train loss 0.319075776902
 395 / 578 The train loss 0.319030943612
 396 / 578 The train loss 0.318649320313
 397 / 578 The train loss 0.318091762086
 398 / 578 The train loss 0.317389729957
 399 / 578 The train loss 0.316735342084
 400 / 578 The train loss 0.31613040132
 401 / 578 The train loss 0.315607521725
 402 / 578 The train loss 0.315002784332
 403 / 578 The train loss 0.314278697669
 404 / 578 The train loss 0.313651288597
 405 / 578 The train loss 0.313014965113
 406 / 578 The train loss 0.312374170379
 407 / 578 The train loss 0.311770660235
 408 / 578 The train loss 0.311240942532
 409 / 578 The train loss 0.312779022098
 410 / 578 The train loss 0.314765124573
 411 / 578 The train loss 0.31561998795
 412 / 578 The train loss 0.315630755093
 413 / 578 The train loss 0.315360102329
 414 / 578 The train loss 0.315504884235
 415 / 578 The train loss 0.31615546181
 416 / 578 The train loss 0.316121945234
 417 / 578 The train loss 0.315734607786
 418 / 578 The train loss 0.315140738446
 419 / 578 The train loss 0.314497467508
 420 / 578 The train loss 0.313855443899
 421 / 578 The train loss 0.313283278251
 422 / 578 The train loss 0.312913432633
 423 / 578 The train loss 0.312570616167
 424 / 578 The train loss 0.312225410315
 425 / 578 The train loss 0.312052921464
 426 / 578 The train loss 0.311940364352
 427 / 578 The train loss 0.312215024317
 428 / 578 The train loss 0.31225612055
 429 / 578 The train loss 0.311940909913
 430 / 578 The train loss 0.311579881838
 431 / 578 The train loss 0.310994528722
 432 / 578 The train loss 0.310323237209
 433 / 578 The train loss 0.30979777722
 434 / 578 The train loss 0.309836439385
 435 / 578 The train loss 0.309797788874
 436 / 578 The train loss 0.309437415443
 437 / 578 The train loss 0.309019835824
 438 / 578 The train loss 0.308621859824
 439 / 578 The train loss 0.308306318863
 440 / 578 The train loss 0.30806088699
 441 / 578 The train loss 0.307743820955
 442 / 578 The train loss 0.307216495149
 443 / 578 The train loss 0.306765992783
 444 / 578 The train loss 0.306616217645
 445 / 578 The train loss 0.307006324618
 446 / 578 The train loss 0.307600223773
 447 / 578 The train loss 0.307753087232
 448 / 578 The train loss 0.307808101718
 449 / 578 The train loss 0.307749191287
 450 / 578 The train loss 0.307584733607
 451 / 578 The train loss 0.307509087976
 452 / 578 The train loss 0.307448973278
 453 / 578 The train loss 0.306991681021
 454 / 578 The train loss 0.306511145635
 455 / 578 The train loss 0.306139542351
 456 / 578 The train loss 0.305721191077
 457 / 578 The train loss 0.305455562167
 458 / 578 The train loss 0.305058424349
 459 / 578 The train loss 0.304555193556
 460 / 578 The train loss 0.303989133368
 461 / 578 The train loss 0.303532137513
 462 / 578 The train loss 0.303088226811
 463 / 578 The train loss 0.303602244439
 464 / 578 The train loss 0.304044931549
 465 / 578 The train loss 0.304050740632
 466 / 578 The train loss 0.303830525394
 467 / 578 The train loss 0.303366637125
 468 / 578 The train loss 0.303136127738
 469 / 578 The train loss 0.303234873661
 470 / 578 The train loss 0.303145226757
 471 / 578 The train loss 0.302695734296
 472 / 578 The train loss 0.302294474758
 473 / 578 The train loss 0.301939215739
 474 / 578 The train loss 0.301527730712
 475 / 578 The train loss 0.301064049006
 476 / 578 The train loss 0.300576548319
 477 / 578 The train loss 0.300083112214
 478 / 578 The train loss 0.299653593731
 479 / 578 The train loss 0.299236028344
 480 / 578 The train loss 0.298950989249
 481 / 578 The train loss 0.29874662536
 482 / 578 The train loss 0.298675243275
 483 / 578 The train loss 0.298436412576
 484 / 578 The train loss 0.29820029319
 485 / 578 The train loss 0.297935742524
 486 / 578 The train loss 0.297532623917
 487 / 578 The train loss 0.297320950922
 488 / 578 The train loss 0.297069237842
 489 / 578 The train loss 0.296840905671
 490 / 578 The train loss 0.296792325332
 491 / 578 The train loss 0.296867552834
 492 / 578 The train loss 0.296770413005
 493 / 578 The train loss 0.296557743261
 494 / 578 The train loss 0.296235885441
 495 / 578 The train loss 0.295992233732
 496 / 578 The train loss 0.295942548918
 497 / 578 The train loss 0.296256169782
 498 / 578 The train loss 0.296650391581
 499 / 578 The train loss 0.296742241061
 500 / 578 The train loss 0.296488373086
 501 / 578 The train loss 0.29619671156
 502 / 578 The train loss 0.295822314606
 503 / 578 The train loss 0.295500062939
 504 / 578 The train loss 0.295352099242
 505 / 578 The train loss 0.295027625487
 506 / 578 The train loss 0.294666786969
 507 / 578 The train loss 0.29431409183
 508 / 578 The train loss 0.293847993896
 509 / 578 The train loss 0.293434764202
 510 / 578 The train loss 0.29303077135
 511 / 578 The train loss 0.292601865478
 512 / 578 The train loss 0.292246922603
 513 / 578 The train loss 0.291983066813
 514 / 578 The train loss 0.291713893718
 515 / 578 The train loss 0.291494234485
 516 / 578 The train loss 0.291197635854
 517 / 578 The train loss 0.291067946306
 518 / 578 The train loss 0.291003293527
 519 / 578 The train loss 0.290859175225
 520 / 578 The train loss 0.291100417264
 521 / 578 The train loss 0.291033635913
 522 / 578 The train loss 0.29061433543
 523 / 578 The train loss 0.290318018805
 524 / 578 The train loss 0.290063752089
 525 / 578 The train loss 0.28988588695
 526 / 578 The train loss 0.289541170454
 527 / 578 The train loss 0.289330475442
 528 / 578 The train loss 0.289224422198
 529 / 578 The train loss 0.289033114713
 530 / 578 The train loss 0.28890448266
 531 / 578 The train loss 0.288784491926
 532 / 578 The train loss 0.288748435228
 533 / 578 The train loss 0.28873187639
 534 / 578 The train loss 0.288424202607
 535 / 578 The train loss 0.288099323366
 536 / 578 The train loss 0.287772284876
 537 / 578 The train loss 0.287507341866
 538 / 578 The train loss 0.287487691912
 539 / 578 The train loss 0.287298722286
 540 / 578 The train loss 0.286980508903
 541 / 578 The train loss 0.286766278711
 542 / 578 The train loss 0.286676816285
 543 / 578 The train loss 0.286349789406
 544 / 578 The train loss 0.286232348778
 545 / 578 The train loss 0.286384040418
 546 / 578 The train loss 0.287417031934
 547 / 578 The train loss 0.289337810939
 548 / 578 The train loss 0.29267980346
 549 / 578 The train loss 0.294493295266
 550 / 578 The train loss 0.295640710294
 551 / 578 The train loss 0.296343014401
 552 / 578 The train loss 0.29679326813
 553 / 578 The train loss 0.297378750676
 554 / 578 The train loss 0.297730536856
 555 / 578 The train loss 0.297833429022
 556 / 578 The train loss 0.29786489794
 557 / 578 The train loss 0.297815437404
 558 / 578 The train loss 0.297547429186
 559 / 578 The train loss 0.297607793764
 560 / 578 The train loss 0.297612208367
 561 / 578 The train loss 0.297340530364
 562 / 578 The train loss 0.297007898525
 563 / 578 The train loss 0.296942166865
 564 / 578 The train loss 0.29684992692
 565 / 578 The train loss 0.296501288794
 566 / 578 The train loss 0.296135923782
 567 / 578 The train loss 0.295829222395
 568 / 578 The train loss 0.295521427156
 569 / 578 The train loss 0.295145846196
 570 / 578 The train loss 0.294764462937
 571 / 578 The train loss 0.294394136062
 572 / 578 The train loss 0.293938924252
 573 / 578 The train loss 0.293951277775
 574 / 578 The train loss 0.293849405615
 575 / 578 The train loss 0.29358403763
 576 / 578 The train loss 0.293699123819
 577 / 578 The train loss 0.293975525624
 578 / 578 The train loss 0.294186117353

Starting epoch 15
Validation:
 1 / 30 The valid loss 0.403793215752
 2 / 30 The valid loss 0.390434771776
 ……（第 3 至 28 批次验证日志省略：valid loss 在约 0.25–0.84 之间波动）
 29 / 30 The valid loss 0.688617017249
 30 / 30 The valid loss 0.676348614196

Training
 1 / 578 The train loss 0.379775941372
 2 / 578 The train loss 0.245047815144
 ……（第 3 至 350 批次训练日志省略：running loss 先降至约 0.099 后逐步回升）
 351 / 578 The train loss 0.312938241049
 352 / 578 The train loss 0.312286298341
 353 / 578 The train loss 0.311788087653
 354 / 578 The train loss 0.311891895103
 355 / 578 The train loss 0.312135889203
 356 / 578 The train loss 0.312275370562
 357 / 578 The train loss 0.312371802388
 358 / 578 The train loss 0.312313166785
 359 / 578 The train loss 0.311972122463
 360 / 578 The train loss 0.311754399497
 361 / 578 The train loss 0.311818793911
 362 / 578 The train loss 0.311809563793
 363 / 578 The train loss 0.311672557443
 364 / 578 The train loss 0.311397923955
 365 / 578 The train loss 0.311001320408
 366 / 578 The train loss 0.310741742533
 367 / 578 The train loss 0.310116138096
 368 / 578 The train loss 0.309371640308
 369 / 578 The train loss 0.308623006421
 370 / 578 The train loss 0.307997569452
 371 / 578 The train loss 0.30746948226
 372 / 578 The train loss 0.306860702182
 373 / 578 The train loss 0.306205017447
 374 / 578 The train loss 0.30562297453
 375 / 578 The train loss 0.304935075253
 376 / 578 The train loss 0.304353615002
 377 / 578 The train loss 0.303664848959
 378 / 578 The train loss 0.302940249197
 379 / 578 The train loss 0.302226177532
 380 / 578 The train loss 0.30159731171
 381 / 578 The train loss 0.301094526176
 382 / 578 The train loss 0.3014217404
 383 / 578 The train loss 0.301814613421
 384 / 578 The train loss 0.301481051012
 385 / 578 The train loss 0.301132922852
 386 / 578 The train loss 0.300551368097
 387 / 578 The train loss 0.300063644663
 388 / 578 The train loss 0.299450537507
 389 / 578 The train loss 0.29884585697
 390 / 578 The train loss 0.298182876437
 391 / 578 The train loss 0.297576979403
 392 / 578 The train loss 0.296987073239
 393 / 578 The train loss 0.296369427971
 394 / 578 The train loss 0.295769795908
 395 / 578 The train loss 0.295326511298
 396 / 578 The train loss 0.294737173931
 397 / 578 The train loss 0.294142930686
 398 / 578 The train loss 0.293495181237
 399 / 578 The train loss 0.292883883424
 400 / 578 The train loss 0.292320715021
 401 / 578 The train loss 0.291764044275
 402 / 578 The train loss 0.291288797016
 403 / 578 The train loss 0.290735506372
 404 / 578 The train loss 0.290274260215
 405 / 578 The train loss 0.289649472901
 406 / 578 The train loss 0.289071328791
 407 / 578 The train loss 0.288492338215
 408 / 578 The train loss 0.288106490137
 409 / 578 The train loss 0.289293418295
 410 / 578 The train loss 0.290982421569
 411 / 578 The train loss 0.291948943669
 412 / 578 The train loss 0.292264756855
 413 / 578 The train loss 0.292770254329
 414 / 578 The train loss 0.293367101869
 415 / 578 The train loss 0.293835660196
 416 / 578 The train loss 0.293737764676
 417 / 578 The train loss 0.293309841341
 418 / 578 The train loss 0.292751508386
 419 / 578 The train loss 0.292260939695
 420 / 578 The train loss 0.291686523466
 421 / 578 The train loss 0.291110013309
 422 / 578 The train loss 0.290487577781
 423 / 578 The train loss 0.290045293391
 424 / 578 The train loss 0.289622499612
 425 / 578 The train loss 0.289410623474
 426 / 578 The train loss 0.289407462323
 427 / 578 The train loss 0.28938642265
 428 / 578 The train loss 0.289203935055
 429 / 578 The train loss 0.288994635306
 430 / 578 The train loss 0.288779128694
 431 / 578 The train loss 0.288283140595
 432 / 578 The train loss 0.287795774505
 433 / 578 The train loss 0.287430727046
 434 / 578 The train loss 0.287256549344
 435 / 578 The train loss 0.287264562993
 436 / 578 The train loss 0.287125907672
 437 / 578 The train loss 0.286671533128
 438 / 578 The train loss 0.28630280063
 439 / 578 The train loss 0.286044694199
 440 / 578 The train loss 0.285829970346
 441 / 578 The train loss 0.285578200881
 442 / 578 The train loss 0.285150695731
 443 / 578 The train loss 0.284764954307
 444 / 578 The train loss 0.284553512381
 445 / 578 The train loss 0.28463715682
 446 / 578 The train loss 0.284844843917
 447 / 578 The train loss 0.285056388539
 448 / 578 The train loss 0.284895940671
 449 / 578 The train loss 0.284507066961
 450 / 578 The train loss 0.284183336087
 451 / 578 The train loss 0.283848033962
 452 / 578 The train loss 0.283550966914
 453 / 578 The train loss 0.283104801616
 454 / 578 The train loss 0.282610866865
 455 / 578 The train loss 0.282133918972
 456 / 578 The train loss 0.281720852925
 457 / 578 The train loss 0.281248409199
 458 / 578 The train loss 0.280698882786
 459 / 578 The train loss 0.280168493593
 460 / 578 The train loss 0.279682056318
 461 / 578 The train loss 0.279184654181
 462 / 578 The train loss 0.278929248666
 463 / 578 The train loss 0.279787316198
 464 / 578 The train loss 0.280303084279
 465 / 578 The train loss 0.280296906945
 466 / 578 The train loss 0.280152681643
 467 / 578 The train loss 0.279728443312
 468 / 578 The train loss 0.279287597869
 469 / 578 The train loss 0.279062868328
 470 / 578 The train loss 0.279015941779
 471 / 578 The train loss 0.278906761734
 472 / 578 The train loss 0.278630484357
 473 / 578 The train loss 0.278233018997
 474 / 578 The train loss 0.277837281791
 475 / 578 The train loss 0.277405499679
 476 / 578 The train loss 0.277071442694
 477 / 578 The train loss 0.276740137109
 478 / 578 The train loss 0.276309009256
 479 / 578 The train loss 0.275919744924
 480 / 578 The train loss 0.27560229246
 481 / 578 The train loss 0.275388324959
 482 / 578 The train loss 0.275240515869
 483 / 578 The train loss 0.274886994077
 484 / 578 The train loss 0.274606080591
 485 / 578 The train loss 0.274375084692
 486 / 578 The train loss 0.274029665558
 487 / 578 The train loss 0.273619282161
 488 / 578 The train loss 0.273274311128
 489 / 578 The train loss 0.272994440609
 490 / 578 The train loss 0.272915975175
 491 / 578 The train loss 0.272898794681
 492 / 578 The train loss 0.272851663486
 493 / 578 The train loss 0.272647292937
 494 / 578 The train loss 0.272314781334
 495 / 578 The train loss 0.272069485625
 496 / 578 The train loss 0.272047839101
 497 / 578 The train loss 0.271969046695
 498 / 578 The train loss 0.271943075773
 499 / 578 The train loss 0.271886107413
 500 / 578 The train loss 0.271697971556
 501 / 578 The train loss 0.271470102011
 502 / 578 The train loss 0.271284041685
 503 / 578 The train loss 0.271033369171
 504 / 578 The train loss 0.270775553404
 505 / 578 The train loss 0.270519881766
 506 / 578 The train loss 0.27032197703
 507 / 578 The train loss 0.270145841931
 508 / 578 The train loss 0.269786725763
 509 / 578 The train loss 0.269356192531
 510 / 578 The train loss 0.268936546728
 511 / 578 The train loss 0.268490506445
 512 / 578 The train loss 0.268146002254
 513 / 578 The train loss 0.267801711604
 514 / 578 The train loss 0.267447451756
 515 / 578 The train loss 0.267133995844
 516 / 578 The train loss 0.267153878642
 517 / 578 The train loss 0.26702460812
 518 / 578 The train loss 0.266797198389
 519 / 578 The train loss 0.266603273359
 520 / 578 The train loss 0.266442148112
 521 / 578 The train loss 0.266243375091
 522 / 578 The train loss 0.265880621381
 523 / 578 The train loss 0.265697657617
 524 / 578 The train loss 0.265543236315
 525 / 578 The train loss 0.265336630327
 526 / 578 The train loss 0.265170555558
 527 / 578 The train loss 0.265095308292
 528 / 578 The train loss 0.264909649702
 529 / 578 The train loss 0.264761699941
 530 / 578 The train loss 0.264767488482
 531 / 578 The train loss 0.264745447016
 532 / 578 The train loss 0.264784871842
 533 / 578 The train loss 0.264713477053
 534 / 578 The train loss 0.264464431135
 535 / 578 The train loss 0.264126362182
 536 / 578 The train loss 0.263765178437
 537 / 578 The train loss 0.263529065889
 538 / 578 The train loss 0.263442086779
 539 / 578 The train loss 0.263129524745
 540 / 578 The train loss 0.262720279102
 541 / 578 The train loss 0.262500971924
 542 / 578 The train loss 0.262380618984
 543 / 578 The train loss 0.262043836048
 544 / 578 The train loss 0.261742725595
 545 / 578 The train loss 0.261751340846
 546 / 578 The train loss 0.26287914426
 547 / 578 The train loss 0.264369569171
 548 / 578 The train loss 0.266997616993
 549 / 578 The train loss 0.268220471673
 550 / 578 The train loss 0.268670710227
 551 / 578 The train loss 0.269404512795
 552 / 578 The train loss 0.269951977818
 553 / 578 The train loss 0.270430174496
 554 / 578 The train loss 0.27074248767
 555 / 578 The train loss 0.270861770182
 556 / 578 The train loss 0.271193297979
 557 / 578 The train loss 0.271941708822
 558 / 578 The train loss 0.272593296246
 559 / 578 The train loss 0.272966192702
 560 / 578 The train loss 0.273374647497
 561 / 578 The train loss 0.273387907033
 562 / 578 The train loss 0.273050507428
 563 / 578 The train loss 0.273061139569
 564 / 578 The train loss 0.273685359597
 565 / 578 The train loss 0.27384814828
 566 / 578 The train loss 0.273692480163
 567 / 578 The train loss 0.273359850019
 568 / 578 The train loss 0.273066324008
 569 / 578 The train loss 0.272748647415
 570 / 578 The train loss 0.272343088899
 571 / 578 The train loss 0.272113926803
 572 / 578 The train loss 0.271770011116
 573 / 578 The train loss 0.271674724361
 574 / 578 The train loss 0.271500290975
 575 / 578 The train loss 0.271181124387
 576 / 578 The train loss 0.271093949304
 577 / 578 The train loss 0.270969134828
 578 / 578 The train loss 0.270820517174

Starting epoch 16
Validation:
 1 / 30 The valid loss 0.417902141809
 ...(中间批次日志省略)...
 30 / 30 The valid loss 0.606414295733

Validation MSE(val_loss): 12.8607292302

Test MSE(test_loss): 2.41759893091
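以上日志中每一行的损失值并非单个批次的损失,而是到当前批次为止的累计平均值,因此曲线看起来较为平滑。下面给出一个简化示意(其中 `running_loss_log` 为假设的函数名,并非本项目原有代码),展示这种随批次更新的累计损失日志是如何生成的:

```python
def running_loss_log(batch_losses, total):
    """按批次生成累计平均损失日志,格式与上方训练日志类似。
    注意:函数名与实现均为示意假设,并非项目原代码。"""
    lines = []
    running = 0.0
    for k, loss in enumerate(batch_losses, start=1):
        # 增量更新均值:running_k = running_{k-1} + (loss - running_{k-1}) / k
        running += (loss - running) / k
        lines.append(' {} / {} The train loss {}'.format(k, total, running))
    return lines

# 一个玩具示例:三个批次的损失 0.4、0.2、0.3
for line in running_loss_log([0.4, 0.2, 0.3], 3):
    print(line)
```

实际训练中,每个批次的损失由 `model.train_on_batch` 之类的调用返回后再按上式累计即可。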
Training
 1 / 578 The train loss 0.242021888494
 ...(中间批次日志省略)...
 527 / 578 The train loss 0.253450923914
 528 / 578 The train loss 0.253358872849
 529 / 578 The train loss 0.25332494138
 530 / 578 The train loss 0.25320185461
 531 / 578 The train loss 0.253026829224
 532 / 578 The train loss 0.253000728102
 533 / 578 The train loss 0.252976473947
 534 / 578 The train loss 0.252709110552
 535 / 578 The train loss 0.252414878703
 536 / 578 The train loss 0.252067163185
 537 / 578 The train loss 0.251856939685
 538 / 578 The train loss 0.251702853338
 539 / 578 The train loss 0.251453573373
 540 / 578 The train loss 0.251160830592
 541 / 578 The train loss 0.250971641455
 542 / 578 The train loss 0.250772339888
 543 / 578 The train loss 0.250495532052
 544 / 578 The train loss 0.250192396112
 545 / 578 The train loss 0.250195233854
 546 / 578 The train loss 0.251101256387
 547 / 578 The train loss 0.252066861329
 548 / 578 The train loss 0.254516189321
 549 / 578 The train loss 0.255902124221
 550 / 578 The train loss 0.256430259606
 551 / 578 The train loss 0.256760145585
 552 / 578 The train loss 0.256811306787
 553 / 578 The train loss 0.256993479409
 554 / 578 The train loss 0.257118985314
 555 / 578 The train loss 0.256965632282
 556 / 578 The train loss 0.256876912331
 557 / 578 The train loss 0.256570631035
 558 / 578 The train loss 0.256428457037
 559 / 578 The train loss 0.256318199175
 560 / 578 The train loss 0.256139482227
 561 / 578 The train loss 0.256094353976
 562 / 578 The train loss 0.256044198333
 563 / 578 The train loss 0.255923686494
 564 / 578 The train loss 0.255663250449
 565 / 578 The train loss 0.255303823185
 566 / 578 The train loss 0.254959687579
 567 / 578 The train loss 0.254718272342
 568 / 578 The train loss 0.254476439249
 569 / 578 The train loss 0.25422181112
 570 / 578 The train loss 0.254076307164
 571 / 578 The train loss 0.25373655839
 572 / 578 The train loss 0.253535054463
 573 / 578 The train loss 0.253169096156
 574 / 578 The train loss 0.252892432022
 575 / 578 The train loss 0.252585404912
 576 / 578 The train loss 0.252867155665
 577 / 578 The train loss 0.253496305635
 578 / 578 The train loss 0.254236062256

Starting epoch 17
Validation:
 1 / 30 The valid loss 0.456449568272
 [per-batch valid loss for batches 2–29 elided]
 30 / 30 The valid loss 0.717483458544

Training
 1 / 578 The train loss 0.403209894896
 [per-batch train loss for batches 2–577 elided]
 578 / 578 The train loss 0.203440917767

Starting epoch 18
Validation:
 1 / 30 The valid loss 0.37879756093
 [per-batch valid loss for batches 2–29 elided]
 30 / 30 The valid loss 0.558606586481

Validation MSE(val_loss): 11.8468315061

Test MSE(test_loss): 1.99327794193
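
The per-batch running losses printed above are hard to read as raw text. A minimal sketch (assuming the log lines keep the exact format shown here, " i / N The train loss x"; `parse_losses` is a hypothetical helper, not part of the project code) for extracting them so they can be plotted as loss curves instead:

```python
import re

# Matches log lines of the form " 12 / 578 The train loss 0.1234"
# or " 3 / 30 The valid loss 0.5678" as printed during training.
LINE_RE = re.compile(r"\s*(\d+)\s*/\s*(\d+)\s+The (train|valid) loss ([\d.]+)")

def parse_losses(log_text, kind="train"):
    """Return the running-average losses of the given kind ('train' or 'valid')."""
    losses = []
    for line in log_text.splitlines():
        m = LINE_RE.match(line)
        if m and m.group(3) == kind:
            losses.append(float(m.group(4)))
    return losses

sample = """ 1 / 578 The train loss 0.2609
 2 / 578 The train loss 0.2087
 1 / 30 The valid loss 0.3787"""

train_losses = parse_losses(sample)           # per-batch train losses
valid_losses = parse_losses(sample, "valid")  # per-batch valid losses
```

The resulting lists can be passed straight to `matplotlib.pyplot.plot` to visualize the training and validation curves per epoch.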
Training
 1 / 578 The train loss 0.260988622904
 ... (train losses for batches 2–577 omitted) ...
 578 / 578 The train loss 0.168475877777

Starting epoch 19
Validation:
 1 / 30 The valid loss 0.393577337265
 ... (valid losses for batches 2–29 omitted) ...
 30 / 30 The valid loss 0.640987485275

From the logs above, the best test-set performance was achieved at:

Starting epoch 18

Validation MSE(val_loss): 11.8468315061

Test MSE(test_loss): 1.99327794193

Analysis: the CNN+RNN seq2seq model performs reasonably well, but after repeating the experiment several times it turns out to overfit easily and its performance is unstable. The root cause is the small amount of training data. Since this is an optional part, further optimization is left for later.
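One way to curb the overfitting and instability noted above is to keep the weights from the epoch with the lowest validation loss and stop training once that loss has not improved for a few epochs. A minimal sketch of the selection logic (plain Python; `best_epoch_with_patience` is a hypothetical helper, not part of this project's code):

```python
def best_epoch_with_patience(val_losses, patience=3):
    """Return the (1-based) epoch with the lowest validation loss,
    stopping once the loss has not improved for `patience` epochs."""
    best_epoch, best_loss, waited = 0, float('inf'), 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best_loss:
            best_epoch, best_loss, waited = epoch, loss, 0
        else:
            waited += 1
            if waited >= patience:
                break  # stop: no improvement for `patience` epochs
    return best_epoch, best_loss

# Example: validation loss bottoms out at epoch 3, then rises again
losses = [0.40, 0.35, 0.30, 0.33, 0.36, 0.38]
print(best_epoch_with_patience(losses))  # → (3, 0.3)
```

In Keras the same effect is available out of the box via the `EarlyStopping` and `ModelCheckpoint` callbacks.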


Part 4: Selecting the Best Model and Generating the Result Video


Notes

  • The optional CNN+RNN seq2seq model is not used to generate the test video; this is left for future work.


1. Comparing the Models with Time-Series Plots

In [7]:
### <1> NVIDIA end-to-end model: Benchmark

# Load the trained model and helper modules
from keras.models import load_model
import preprocess_data
import matplotlib.pyplot as plt

model = load_model('./models/nvidia_model.h5')

# Load and normalize the test data
test_imgs, test_wheels = preprocess_data.load_data('test')
test_imgs = preprocess_data.nomorlize_image(test_imgs)

# Predict steering angles on the test set
predicted_wheels = model.predict(test_imgs, batch_size=128, verbose=0)

# Plot predicted vs. ground-truth angles as a time series
plt.figure()
plt.title('Benchmark NVIDIA end-to-end model')
plt.plot(predicted_wheels)
plt.plot(test_wheels)
plt.ylabel('Steering angle', fontsize=11)
plt.xlabel('Frame counts', fontsize=11)
plt.legend(['Predicted Wheels', 'Ground Truth Wheels'], loc='upper right')
plt.xlim((0, 2700))
plt.grid()
plt.savefig('./images/img/1_benchmark.png', dpi=300)
plt.show()
load data start!
The epoch10 mkv is processing
loading data filished!
In [8]:
### <2> NVIDIA end-to-end model: Refined Model

# Load the refined model (test data is already in memory from the cell above)
model = load_model('./models/nvidia_refined_model.h5')

# Predict steering angles on the test set
predicted_wheels = model.predict(test_imgs, batch_size=128, verbose=0)

# Plot predicted vs. ground-truth angles as a time series
plt.figure()
plt.title('Refined NVIDIA end-to-end Model')
plt.plot(predicted_wheels)
plt.plot(test_wheels)
plt.ylabel('Steering angle', fontsize=11)
plt.xlabel('Frame counts', fontsize=11)
plt.legend(['Predicted Wheels', 'Ground Truth Wheels'], loc='upper right')
plt.xlim((0, 2700))
plt.grid()
plt.savefig('./images/img/2_refined.png', dpi=300)
plt.show()
In [9]:
### <3> NVIDIA end-to-end model: Refined Model + steering-angle data augmentation

# Load the model trained with the augmented steering data
model = load_model('./models/nvidia_ra_model.h5')

# Predict steering angles on the test set
predicted_wheels = model.predict(test_imgs, batch_size=128, verbose=0)

# Plot predicted vs. ground-truth angles as a time series
plt.figure()
plt.title('Refined NVIDIA end-to-end Plus data Model')
plt.plot(predicted_wheels)
plt.plot(test_wheels)
plt.ylabel('Steering angle', fontsize=11)
plt.xlabel('Frame counts', fontsize=11)
plt.legend(['Predicted Wheels', 'Ground Truth Wheels'], loc='upper right')
plt.xlim((0, 2700))
plt.grid()
plt.savefig('./images/img/3_ra.png', dpi=300)
plt.show()
In [10]:
### <4> VGG16 + Nvidia model: notop + top 2 blocks

# Load the VGG16 transfer-learning model
model = load_model('./models/vgg16_model.h5')

# Predict steering angles on the test set
predicted_wheels = model.predict(test_imgs, batch_size=128, verbose=0)

# Plot predicted vs. ground-truth angles as a time series
plt.figure()
plt.title('VGG16 + Nvidia model notop + top2 blocks')
plt.plot(predicted_wheels)
plt.plot(test_wheels)
plt.ylabel('Steering angle', fontsize=11)
plt.xlabel('Frame counts', fontsize=11)
plt.legend(['Predicted Wheels', 'Ground Truth Wheels'], loc='upper right')
plt.xlim((0, 2700))
plt.grid()
plt.savefig('./images/img/4_vgg16.png', dpi=300)
plt.show()
In [11]:
### <5> VGG16 + Nvidia model: notop + top 2 blocks + steering-angle data augmentation

# Load the VGG16 transfer-learning model trained with augmented steering data
model = load_model('./models/vgg16_add_model.h5')

# Predict steering angles on the test set
predicted_wheels = model.predict(test_imgs, batch_size=128, verbose=0)

# Plot predicted vs. ground-truth angles as a time series
plt.figure()
plt.title('VGG16 + Nvidia model notop + top2 blocks plus data')
plt.plot(predicted_wheels)
plt.plot(test_wheels)
plt.ylabel('Steering angle', fontsize=11)
plt.xlabel('Frame counts', fontsize=11)
plt.legend(['Predicted Wheels', 'Ground Truth Wheels'], loc='upper right')
plt.xlim((0, 2700))
plt.grid()
plt.savefig('./images/img/5_vgg16ad.png', dpi=300)
plt.show()

Analysis: the time-series plots above show that the two VGG16 + Nvidia models produce noticeably smoother steering angles that track the original angles more closely than the three plain Nvidia models, which is consistent with the lower 'test loss' values seen for those two models in Part 3. So a VGG16 + Nvidia model should be chosen. Although 'VGG16 + Nvidia model: notop + top 2 blocks' yields a slightly lower 'test loss' than 'VGG16 + Nvidia model: notop + top 2 blocks + steering-angle data augmentation', the latter's curve clearly follows the shape of the ground-truth data more closely, so in my judgment the best model is the last one, 'VGG16 + Nvidia model: notop + top 2 blocks + steering-angle data augmentation'.


2. Generating the Result Video

Strategy

  • 1. In ./params.py, set the image width and height to '80'.
  • 2. In ./run.py, add a 'nomorlize_image()' function and call it from 'img_pre_process()'.
  • 3. In ./utils.py, change 'get_model' to load 'vgg16_add_model.h5' and 'vgg16_add_model.json' (lines 405 and 406), then run 'python run.py' to generate the video.
  • 4. The generated video is saved as 'epoch10_human_machine.mp4' in the ./output folder.

Notes: two errors occurred on the first run: 1. the file 'openh264-1.6.0-win64msvc.dll' had to be downloaded and placed in the project root directory (resolved); 2. the 'mkv_to_mp4()' function in ./utils had to be modified to work on Windows 10, see ./utils for details (resolved).
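Before rendering the result video, the frame-to-frame jitter in the predicted angles could optionally be reduced with a centered moving average. This is only a sketch of such a post-processing step (`smooth_wheels` is a hypothetical helper, not part of run.py):

```python
import numpy as np

def smooth_wheels(angles, window=5):
    """Smooth per-frame steering angles with a centered moving average.
    `window` should be odd; edges are padded so the output keeps the
    same length as the input."""
    angles = np.asarray(angles, dtype=float)
    kernel = np.ones(window) / window
    padded = np.pad(angles, window // 2, mode='edge')
    return np.convolve(padded, kernel, mode='valid')

# Alternating raw angles get pulled toward their local average
raw = [0.0, 2.0, 0.0, 2.0, 0.0, 2.0]
print(smooth_wheels(raw, window=3))
```

A small window keeps the rendered wheel responsive while removing single-frame spikes; too large a window would lag behind real turns.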

In [1]:
### Display the output video inline
from IPython.display import HTML
output = './output/epoch10_human_machine.mp4'

HTML("""
<video width="960" height="540" controls>
  <source src="{0}">
</video>
""".format(output))
Out[1]:
In [ ]: